
Article
Medicine and Pharmacology
Clinical Medicine

Masatoshi Miyamura

,

Goro Fujiki

,

Yumiko Kanzaki

,

Kosuke Tsuda

,

Hironaka Asano

,

Masaaki Hoshiga

,

Hideaki Morita

Abstract: Background: Recent advances in artificial intelligence (AI) have produced ChatGPT-4o, a multimodal large language model (LLM) capable of processing both text and image inputs. Although ChatGPT has demonstrated usefulness in medical examinations, few studies have evaluated its image-analysis performance. Methods: This study compared GPT-4o and GPT-4 using public questions from the 116th–118th Japan National Medical Licensing Examinations (JNMLE), each consisting of 400 questions. Both models answered in Japanese using simple prompts, with screenshots supplied for image-based questions. Accuracy was analyzed across essential, general, and clinical questions, with statistical comparisons by chi-square tests. Results: GPT-4o consistently outperformed GPT-4, achieving passing scores in all three examinations. In the 118th JNMLE, GPT-4o scored 457 points versus 425 for GPT-4. GPT-4o demonstrated higher accuracy for image-based questions in the 117th and 116th exams, although the difference in the 118th was not significant. For text-based questions, GPT-4o showed superior medical knowledge, clinical reasoning, and ethical response behavior, notably avoiding prohibited options. Conclusion: Overall, GPT-4o exceeded GPT-4 in both text and image domains, suggesting strong potential as a diagnostic aid and educational resource. Its balanced performance across modalities highlights its promise for integration into future medical education and clinical decision support.
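A minimal sketch of the kind of chi-square comparison the abstract describes, assuming per-exam counts of correctly answered questions out of 400; the counts below are invented placeholders, not the study's data.

```python
# Hypothetical counts of correct answers out of 400 questions for one exam;
# the real figures are reported in the paper and not reproduced here.
from scipy.stats import chi2_contingency

n_questions = 400
correct = {"GPT-4o": 362, "GPT-4": 330}   # placeholder values

table = [
    [correct["GPT-4o"], n_questions - correct["GPT-4o"]],
    [correct["GPT-4"], n_questions - correct["GPT-4"]],
]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```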
Article
Physical Sciences
Mathematical Physics

Emmanuil Manousos

Abstract: In this book, we present a mathematically consistent paradigm for describing nature. Modern physics is supported by an immense body of experimental and observational data, alongside a theoretical framework that, at times, aligns with this data—and at other times, diverges from it. The absence of a clear theoretical explanation for the cause of quantum phenomena, combined with the growing mismatch between cosmological observations and theoretical predictions, suggests that a fundamental principle of nature is missing from current physical theories. The Self-Variation Theory (SVT) introduces such a principle into the theoretical foundations of physics. In this work, we present the core principles and primary consequences of SVT. The theory is built upon three foundational elements: the Principle of Self-Variation, the Principle of Conservation of Energy-Momentum, and a definition of the rest mass of fundamental particles. From these principles, Self-Variation Theory leads to a number of key conclusions: it predicts a specific internal structure of particles that extends across all distance scales, it provides a unified explanation for particle interactions, and it accounts for both cosmological data and quantum phenomena, offering a coherent framework that connects them. The theory's predictions regarding the origin, evolution, and current state of the universe are in agreement with available observational evidence. From subatomic scales to astronomical distances spanning billions of light-years, Self-Variation Theory demonstrates a remarkable consistency with experimental and observational data. The structure of the book has been carefully designed to ensure the necessary clarity and precision in presenting the theory. A sequence of interconnected derivations begins with the fundamental principles, proceeds through their synthesis, and culminates in the field equations of the theory—applicable across all distance scales.
Review
Biology and Life Sciences
Animal Science, Veterinary Science and Zoology

Eiji Iwazaki

,

Akihiro Mori

Abstract: Regular health screening has not yet been widely recognized among pet owners. Thus, we aimed to establish clinical health diagnostic criteria for cats and to develop objective, easy-to-use methods for calculating an obesity index and assessing body composition. This review focuses on body composition measurement techniques and introduces evaluation methods for animals. The Body Condition Score and Muscle Condition Score are used to assess nutritional status. Although easy to measure, these techniques are subjective and dependent on the operator's skill. Objective methods for assessing obesity and body fat percentage, such as several body mass indices and bioelectrical impedance analysis, have been established, but they have not been widely adopted owing to their complexity. The gold standard for body composition evaluation includes several techniques; however, their invasiveness and cost make them challenging to use in clinical settings. Consequently, we established a method for assessing body fat percentage and muscle mass using a feline body mass index (fBMI) based on skeletal zoometry measurements, together with subcutaneous fat and specific muscle thicknesses measured by ultrasonography. These are objective obesity evaluation methods that can be easily and consistently used in clinical veterinary practice.
Article
Engineering
Other

Victor Hugo Garcia Ortega

,

Josefina Barcenas Lopez

,

Enrique Ruiz-Velasco Sanchez

Abstract: Laboratories across educational levels have traditionally required in-person attendance, limiting practical activities to specific times and physical spaces. This paper presents a technological architecture based on a system-on-chip (SoC) and a connectivist model, grounded in Connectivism Learning Theory, for implementing a remote laboratory in digital logic design using FPGA devices. The architecture leverages an Internet of Things (IoT) environment to provide applications and servers that enable remote access, programming, manipulation, and visualization of FPGA-based development boards located in the institution’s laboratory, from anywhere and at any time. The connectivist model allows learners to interact with multiple nodes for attending synchronous classes, performing laboratory exercises, managing the remote laboratory, and accessing educational resources asynchronously. This approach aims to enhance learning, knowledge transfer, and skills development. A four-year evaluation was conducted, including one experimental group using an e-learning approach and three in-person control groups from a Digital Logic Design course. The experimental group achieved an average performance score of 9.777, surpassing the control groups, suggesting improved academic outcomes with the proposed system. Additionally, a Technology Acceptance Model–based survey showed very high acceptance among learners. This paper presents a novel connectivist model, which we have called the Massive Open Online Laboratory.
Article
Public Health and Healthcare
Public Health and Health Services

Uwem Friday Ekpo

,

Solomon Monday Jacob

,

Hammed Mogaji

,

Francisca Olamiju

,

Fajana Oyinlola

,

Ijeoma Achu

,

Olanike O. Oladipupo

,

Alice Y. Kehinde

,

Imaobong O Umah

,

Fatai Oyediran

+2 authors

Abstract: As Nigeria advances toward the elimination of soil-transmitted helminthiasis (STH), updated endemicity maps are essential for guiding programmatic decisions. Here, we conducted a cross-sectional study from July to August 2024 in ten local government areas (LGAs) of Ondo State to update the STH endemicity maps. LGAs were stratified into three categories (C1–C3) based on the history of preventive chemotherapy (PC), with C1 being endemic LGAs with ≥ 5 effective rounds of PC, C2 being endemic LGAs with < 5 effective rounds of PC, and C3 being low-endemicity LGAs (STH prevalence < 20%; PC not required). A total of 4507 school-aged children (5–14 years) from 151 systematically selected communities (15 per LGA) provided fresh stool samples to assess the prevalence and intensity of STH. Stool samples were examined using the Kato-Katz technique. Prevalence of STH was aggregated at the LGA level and compared with World Health Organization thresholds. In the first category (C1), the baseline prevalence was reduced significantly by 60–96%, with specific reductions in Akoko Southwest (from 28.2% to 0.4%, risk ratio (RR) = 0.01), Akure North (from 39% to 1.5%, RR = 0.04), Ifedore (from 25% to 2.5%, RR = 0.10), and Ondo East (from 45.2% to 8.2%, RR = 0.18). In the second category (C2), the baseline was reduced significantly by 66–100%, with Akure South (from 29% to 1.2%, RR = 0.04), Ose (from 20% to 2.2%, RR = 0.11), Owo (~100% reduction), and Odigbo (from 38% to 12.8%, RR = 0.34). In the C3 LGAs, infection was significantly below the baseline threshold, with Akoko Northwest (from 5.2% to 0.9%, RR = 0.17) and Idanre (from 14.2% to 1.8%, RR = 0.13). Overall, significant reductions in STH prevalence were observed across the surveyed LGAs, with risk ratios ranging from 0.04 to 0.40. These findings update the endemicity map for the ten LGAs in Ondo State, demonstrating significant progress toward STH elimination following PC implementation.
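As a quick arithmetic check (not the study's analysis pipeline, which would also include confidence intervals), the reported risk ratios follow directly from dividing follow-up prevalence by baseline prevalence:

```python
# Baseline vs. follow-up prevalence (%) for the C1 LGAs quoted in the abstract.
examples = {
    "Akoko Southwest": (28.2, 0.4),
    "Akure North": (39.0, 1.5),
    "Ifedore": (25.0, 2.5),
    "Ondo East": (45.2, 8.2),
}
for lga, (baseline, followup) in examples.items():
    rr = followup / baseline          # risk ratio = follow-up / baseline
    print(f"{lga}: RR = {rr:.2f}")    # matches the reported 0.01, 0.04, 0.10, 0.18
```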
Article
Computer Science and Mathematics
Software

Dong Liu

Abstract: This paper introduces Primary Breadth-First Development (PBFD) and Primary Depth-First Development (PDFD)—formally and empirically verified methodologies for scalable, industrial-grade full-stack software engineering. Both approaches enforce structural and behavioral correctness through graph-theoretic modeling, bridging formal methods and real-world practice. PBFD and PDFD model software development as layered directed graphs with unified state machines, verified using Communicating Sequential Processes (CSP) and Linear Temporal Logic (LTL). This guarantees bounded-refinement termination, deadlock freedom, and structural completeness. To manage hierarchical data at scale, we present Three-Level Encapsulation (TLE)—a novel bitmask-based encoding scheme. TLE operations are verified via CSP failures-divergences refinement, ensuring constant-time updates and compact storage that underpin PBFD's robust performance. PBFD demonstrates exceptional industrial viability through eight years of enterprise deployment with zero critical failures, achieving approximately 20× faster development than Salesforce OmniScript, 7–8× faster query performance, and 11.7× storage reduction compared to conventional relational models. These results are established through longitudinal observational studies, quasi-experimental runtime comparisons, and controlled schema-level experiments. Open-source Minimum Viable Product implementations validate key behavioral properties, including bounded refinement and constant-time bitmask operations, under reproducible conditions. All implementations, formal specifications, and non-proprietary datasets are publicly available.
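The abstract does not spell out TLE's encoding, so the following is only a generic illustration of why fixed-width, three-level bit fields give compact keys and constant-time updates; the field widths and names are assumptions, not the paper's scheme.

```python
# Generic three-level bit-field packing (NOT the paper's TLE; widths assumed).
L1_BITS, L2_BITS, L3_BITS = 8, 8, 16

def pack(l1: int, l2: int, l3: int) -> int:
    """Pack three hierarchy levels into a single integer key."""
    return (l1 << (L2_BITS + L3_BITS)) | (l2 << L3_BITS) | l3

def unpack(key: int) -> tuple[int, int, int]:
    """Recover the three levels with O(1) shifts and masks."""
    l3 = key & ((1 << L3_BITS) - 1)
    l2 = (key >> L3_BITS) & ((1 << L2_BITS) - 1)
    l1 = key >> (L2_BITS + L3_BITS)
    return l1, l2, l3

key = pack(3, 12, 1025)
assert unpack(key) == (3, 12, 1025)   # round-trips without any lookup tables
```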
Article
Computer Science and Mathematics
Algebra and Number Theory

Weicun Zhang

Abstract: The Extended, Generalized, and Grand Riemann Hypotheses are proved under a unified framework, which is based on the divisibility of entire functions expressed as absolutely convergent infinite products of polynomial factors, where the uniqueness of zero multiplicities plays a critical role. Consequently, the existence of Landau-Siegel zeros is excluded, thereby confirming the Landau-Siegel zeros conjecture.
Review
Public Health and Healthcare
Public, Environmental and Occupational Health

Rosana González Combarros

,

Mariano González-García

,

Gerardo David Blanco-Díaz

,

Kharla Segovia-Bravo

,

José Luis Reino Moya

,

José Ignacio López-Sánchez

Abstract: Over the last 15 years, mixture risk assessment for food xenobiotics has evolved from conceptual discussions and simple screening tools, such as the Hazard Index (HI), towards operational, component-based and probabilistic frameworks embedded in major food-safety institutions. This review synthesizes methodological and regulatory advances in cumulative risk assessment for dietary “cocktails” of pesticides, contaminants and other xenobiotics, with a specific focus on food-relevant exposure scenarios. At the toxicological level, the field is now anchored in concentration/dose addition as the default model for similarly acting chemicals, supported by extensive experimental evidence that most environmental mixtures behave approximately dose-additively at low effect levels. Building on this paradigm, a portfolio of quantitative metrics has been developed to operationalize component-based mixture assessment: HI as a conservative screening anchor; Relative Potency Factors (RPF) and Toxic Equivalents (TEQ) to express doses within cumulative assessment groups; the Maximum Cumulative Ratio (MCR) to diagnose whether risk is dominated by one or several components; and the combined Margin of Exposure (MOET) as a point-of-departure–based integrator that avoids compounding uncertainty factors. Regulatory frameworks developed by EFSA, the U.S. EPA and FAO/WHO converge on tiered assessment schemes, biologically informed grouping of chemicals and dose addition as the default model for similarly acting substances, while differing in scope, data infrastructure and legal embedding. Implementation in food safety critically depends on robust exposure data streams. Total Diet Studies provide population-level, “as eaten” exposure estimates through harmonized food-list construction, home-style preparation and composite sampling, and are increasingly combined with conventional monitoring. In parallel, human biomonitoring quantifies internal exposure to diet-related xenobiotics such as PFAS, phthalates, bisphenols and mycotoxins, embedding mixture assessment within a dietary-exposome perspective. Across these developments, structured uncertainty analysis and decision-oriented communication have become indispensable. By integrating advances in toxicology, exposure science and regulatory practice, this review outlines a coherent, tiered and uncertainty-aware framework for assessing real-world dietary mixtures of xenobiotics, and identifies priorities for future work, including mechanistically and data-driven grouping strategies, expanded use of physiologically based pharmacokinetic modelling and refined mixture-sensitive indicators to support public-health decision-making.
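A small numerical sketch of the screening metrics named above, using their standard textbook definitions (hazard-quotient sum for HI, HI divided by the largest hazard quotient for MCR, and the reciprocal-sum rule for the combined margin of exposure); the exposures and reference values are invented for illustration only.

```python
# Hypothetical three-component dietary mixture (units: mg/kg bw/day).
exposure = {"A": 0.02, "B": 0.05, "C": 0.01}
ref_dose = {"A": 0.5, "B": 0.2, "C": 0.1}    # ADI-like reference values (assumed)
pod = {"A": 50.0, "B": 20.0, "C": 10.0}      # points of departure (assumed)

hq = {k: exposure[k] / ref_dose[k] for k in exposure}       # hazard quotients
hi = sum(hq.values())                                       # Hazard Index
mcr = hi / max(hq.values())                                 # Maximum Cumulative Ratio
moet = 1 / sum(exposure[k] / pod[k] for k in exposure)      # combined MOE (reciprocal rule)
print(f"HI = {hi:.2f}, MCR = {mcr:.2f}, MOET = {moet:.0f}")
```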

Article
Computer Science and Mathematics
Computer Vision and Graphics

Lin Huang

,

Xubin Ren

,

Daiming Qu

,

Lanhua Li

,

Jing Xu

Abstract: Optical Fiber Composite Overhead Ground Wire (OPGW) cables serve dual functions in power systems: lightning protection and critical communication infrastructure for real-time grid monitoring. Accurate OPGW identification during UAV inspections is essential to prevent miscuts and maintain power and communication functionality. However, detecting small, twisted OPGW segments among visually similar ground wires is challenging, particularly given the computational and energy constraints of edge-based UAV platforms. We propose OPGW-DETR, a lightweight detector based on the D-FINE framework, optimized for low-power operation to enable reliable onboard detection. The model incorporates two key innovations: multi-scale convolutional global average pooling (MC-GAP), which fuses spatial features across multiple receptive fields and integrates frequency-domain information for enhanced fine-grained representation, and a hybrid gating mechanism that dynamically balances global and spatial features while preserving original information through residual connections. By enabling real-time onboard inference with minimal energy consumption, OPGW-DETR addresses UAV battery and bandwidth limitations while ensuring continuous detection capability. Evaluated on a custom OPGW dataset, the S-scale model achieves a 3.9% improvement in average precision (AP) and a 2.5% improvement in AP50 over the baseline. These gains enhance communication reliability by reducing misidentification risks, enabling uninterrupted grid monitoring in low-power UAV inspection scenarios where accurate detection is critical for communication integrity and grid safety.
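The paper's MC-GAP module is not specified beyond the description above, so the following is only one plausible reading of "multi-scale convolutional global average pooling": parallel convolutions with different receptive fields, each globally pooled and fused into channel weights. Kernel sizes, channel counts, and the sigmoid gating are assumptions, not the authors' design.

```python
import torch
import torch.nn as nn

class MultiScaleGAP(nn.Module):
    """Illustrative multi-scale conv + global-average-pooling block (assumed design)."""
    def __init__(self, channels: int, scales=(1, 3, 5)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(channels, channels, k, padding=k // 2) for k in scales
        )
        self.fuse = nn.Linear(channels * len(scales), channels)

    def forward(self, x):                                          # x: (N, C, H, W)
        pooled = [b(x).mean(dim=(2, 3)) for b in self.branches]    # GAP per scale
        weights = torch.sigmoid(self.fuse(torch.cat(pooled, dim=1)))
        return x * weights[:, :, None, None]                       # channel re-weighting

out = MultiScaleGAP(64)(torch.randn(2, 64, 32, 32))
print(out.shape)   # torch.Size([2, 64, 32, 32])
```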
Article
Environmental and Earth Sciences
Oceanography

Carl L. Amos

,

Hachem Kassem

,

Victoriano Martinez-Alvarez

,

Thamer Al Rashidi

Abstract: The Mar Menor is the second-largest coastal lagoon in the Mediterranean Sea, with a surface area of about 136 km². It is separated from the open sea by a sandy barrier system (La Manga) interrupted by three tidal inlets. As a result of high evaporation, it is hypersaline (42–47 ppt) in parts. This study examines the factors leading to the rise in sea surface temperature (SST) in the Mar Menor through the analysis of long-term SST using HadSST1.1 data together with shorter-term MODIS and OISST data. A thermal box model was constructed for the lagoon in an attempt to balance the major heat sources and sinks. In addition, a thermal probe was deployed in 0.3 m of water to evaluate the benthic heat flux of the shelly fine sand that covers the lagoon seabed. Results show that the vertical thermal gradient in the seabed inverts between day and night. Prior to 1980 there was no clear trend in SST, and variations were strongly associated with the AMO and NAO. After 1980, maximum summertime SST showed a steady increase of 0.34°C/decade. Cross-correlation of SST in the Mar Menor with external drivers showed that it is dominated by SST of the Western Mediterranean, followed by CO2, the AMO and the IOD. There was a strong inverse relationship with sunspot activity and the Spanish national GDP. There were no significant links in trends between SST in the Mar Menor and the PDO, NAO or ENSO 3.4 in a Spearman rank-order evaluation and PCA analysis.
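As an illustration of how a trend like the reported 0.34°C/decade can be extracted, a least-squares fit of annual maximum summertime SST against year gives the slope directly; the series below is synthetic, not the HadSST/MODIS/OISST data used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1980, 2021)
# Synthetic annual maximum SST with an imposed warming trend plus noise.
sst_max = 28.0 + 0.034 * (years - 1980) + rng.normal(0.0, 0.2, years.size)

slope_per_year, intercept = np.polyfit(years, sst_max, 1)
print(f"trend ≈ {slope_per_year * 10:.2f} °C/decade")
```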
Article
Engineering
Electrical and Electronic Engineering

Shitikantha Dash

,

Dikshit Chauhan

,

Dipti Srinivasan

Abstract: A sustainable city requires a sustainable means of transportation. This ambition is leading towards a higher penetration of electric vehicles (EVs) in our cities, in both the private and commercial sectors, putting more and more burden on the existing power grid. Modern deregulated power grids vary electricity tariffs from location to location and from time to time to compensate for any additional burden. In this paper, we propose a profit-aware solution to strategically manage the movements of EVs in the city to support the grid while exploiting these locational, time-varying prices. This work is divided into three parts: (M1) profit-aware charging location and optimal route selection, (M2) profit-aware charging and discharging location and optimal route selection, and (M2b) profit-aware charging and discharging location and optimal route selection considering demand-side flexibility. The methods are implemented in MATLAB using the Gurobi optimisation solver. From the extensive case study, it is found that M1 can yield profits up to 2 times higher than those of its competitors, whereas M2 can achieve profits up to 2.5 times higher and simultaneously provide substantial grid support. Additionally, the M2b extension makes M2 more efficient in terms of grid support.
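The paper's MATLAB/Gurobi formulation is far richer (routing, discharging, demand-side flexibility), but the core idea of exploiting time-varying prices can be illustrated with a toy linear program that schedules charging into the cheapest hours; the prices, power limit, and energy target below are invented.

```python
import numpy as np
from scipy.optimize import linprog

prices = np.array([0.30, 0.22, 0.15, 0.12, 0.18, 0.28])  # $/kWh, hypothetical tariff
p_max = 7.0           # kW charging limit per hour (assumed)
energy_needed = 20.0  # kWh to be added over the horizon (assumed)

res = linprog(
    c=prices,                                             # minimise charging cost
    A_eq=np.ones((1, prices.size)), b_eq=[energy_needed], # meet the energy target
    bounds=[(0.0, p_max)] * prices.size,                  # per-hour charger limit
    method="highs",
)
print(res.x, f"-> cost = ${res.fun:.2f}")   # energy lands in the cheapest hours
```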
Review
Engineering
Mechanical Engineering

Aswin Karakadakattil

Abstract: Metal additive manufacturing (AM) has emerged as a transformative route for producing lightweight, high-precision, and geometrically complex components in the aerospace, biomedical, and microelectronic sectors. Among AM technologies, Laser Powder Bed Fusion (LPBF) offers exceptional design freedom; however, its widespread adoption, particularly for titanium alloys, remains constrained by two persistent challenges: shrinkage-induced dimensional deviation and porosity-related performance loss. In LPBF-processed Ti-6Al-4V, residual linear deviation typically falls within 0.1–0.8% when geometric compensation, preheating, and support strategies are implemented, while raw, uncompensated shrinkage is more commonly reported in the range of 1.2–2.0%, especially for thin-wall or thermally constrained geometries. Volumetric contraction (approximately 2–6%) may remain significant depending on part architecture and localized thermal accumulation. Concurrently, gas-induced and lack-of-fusion pores continue to undermine fatigue resistance and dimensional reliability. Research into process optimization, thermal management, and post-processing, such as Hot Isostatic Pressing (HIP), vacuum sintering, and stress-relief annealing, has improved density and mechanical integrity, while recent developments in AI-assisted monitoring, physics-informed models, and digital-twin frameworks are redefining defect prediction and control. Drawing on more than 100 peer-reviewed studies, this review synthesizes mechanism-driven insights and outlines a forward-looking roadmap, demonstrating how hybrid processing, real-time sensing, and data-centric control collectively advance the pathway toward defect-minimized, industrial-scale manufacturing of titanium components.
Article
Physical Sciences
Mathematical Physics

Edward Bormashenko

,

Shraga Shoval

,

Ramita Sarkar

Abstract: We introduce a new combinatorial framework for classical mechanics, the Ramsey–Hamiltonian approach, which interprets Poisson-bracket relations through the lens of finite and infinite Ramsey theory. Classical Hamiltonian mechanics is built upon the algebraic structure of Poisson brackets, which encode dynamical couplings, symmetries, and conservation laws. We reinterpret this structure as a bi-colored complete graph, whose vertices represent phase-space observables and whose edges are colored gold or silver according to whether the corresponding Poisson bracket vanishes or not. Because Poisson brackets are invariant under canonical transformations (including their centrally extended Galilean form), the induced graph coloring is itself a canonical invariant. Applying Ramsey theory to this graph yields a universal structural result: any six observables necessarily form at least one monochromatic triangle, independent of the Hamiltonian's specific form. Gold triangles correspond to mutually commuting (Liouville-compatible) observables that generate Abelian symmetry subalgebras, whereas silver triangles correspond to fully interacting triplets of dynamical quantities. When the Hamiltonian is included as a vertex, the resulting Hamilton–Poisson graphs provide a direct graphical interpretation of Noether symmetries, cyclic coordinates, and conserved quantities through star-like subgraphs centered on the Hamiltonian. We further extend the framework to Hamiltonian systems with countably infinite degrees of freedom, such as vibrating strings or field-theoretic systems, where the infinite Ramsey theorem guarantees the existence of infinite monochromatic cliques of observables. Finally, we introduce Shannon-type entropy measures to quantify structural order in Hamilton–Poisson graphs through the distribution of monochromatic polygons. The Ramsey–Hamiltonian approach offers a novel, symmetry-preserving, and fully combinatorial reinterpretation of classical mechanics, revealing universal dynamical patterns that must occur in every Hamiltonian system regardless of its detailed structure.
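The combinatorial backbone of the claim that any six observables force a monochromatic triangle is the classical fact R(3,3) = 6, which can be checked by brute force; the sketch below verifies it for the gold/silver colouring used in the abstract.

```python
from itertools import combinations, product

def has_mono_triangle(n, colouring):
    """True if some triangle of K_n has all three edges in one colour."""
    return any(
        colouring[(a, b)] == colouring[(a, c)] == colouring[(b, c)]
        for a, b, c in combinations(range(n), 3)
    )

def every_colouring_has_triangle(n):
    edges = list(combinations(range(n), 2))
    return all(
        has_mono_triangle(n, dict(zip(edges, colours)))
        for colours in product(("gold", "silver"), repeat=len(edges))
    )

print(every_colouring_has_triangle(5))  # False: K5 admits a triangle-free 2-colouring
print(every_colouring_has_triangle(6))  # True: six observables force a monochromatic triangle
```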

Article
Biology and Life Sciences
Biophysics

Pavel Straňák

Abstract: The emergence and persistence of life pose a profound paradox: abiogenesis appears statistically almost impossible under standard physical chemistry, yet once present, living systems exhibit remarkable long-term stability against entropic decay. Here we propose that both phenomena can be explained by the action of a hitherto unobserved informational reservoir that subtly “leaks” into biological systems, biasing microstate probabilities in real time. While quantum coherence and nonlocality currently represent the most plausible physical substrates, the hypothesis deliberately remains agnostic about the ultimate origin of this reservoir. Crucially, the transfer need not be intentional; it may constitute an unintended “crosstalk” across an ontological boundary—analogous to sound leaking through a wall between apartments. This framework offers a strictly naturalistic alternative to intelligent design theories while generating falsifiable predictions distinguishable from both pure chance and directed panspermia.
Article
Biology and Life Sciences
Other

Cinthia Jael Gaxiola-Calvo

,

Diana Fimbres-Olivarría

,

Ricardo Iván González-Vega

,

Yaeel Isbeth Cornejo-Ramírez

,

Ariadna Thalía Bernal-Mercado

,

Saul Ruiz-Cruz

,

José de Jesús Ornelas-Paz

,

Miguel Ángel Robles-García

,

José Rogelio Ramos-Enríquez

,

Carmen Lizette Del-Toro-Sánchez

Abstract: Blood groups in the ABO system and the RhD factor are of great clinical importance, as they are related to susceptibility to various diseases caused by oxidative stress. The use of antioxidants such as C-phycocyanin (a phycobiliprotein) could be an alternative to mitigate oxidative stress in the blood. Therefore, the objective of this study was to evaluate the antioxidant and erythroprotective activity of C-phycocyanin (C-PC) from Spirulina sp. against oxidative stress caused by peroxyl radicals, before and after in vitro digestion, comparing susceptibilities between blood groups. C-phycocyanin from Spirulina sp. was obtained commercially. The antioxidant capacity of the bioaccessible fraction of C-PC, measured by ABTS+•, DPPH•, and FRAP assays, increased compared to baseline in all assays. The samples appear to have a high hydrogen atom transfer capacity. C-PC is not cytotoxic in most blood groups. The AAPH hemolysis assays showed differences between blood groups, yielding 27.90, 22.60, 26.94, 27.66, 28.16, 28.34, and 24.91% hemolysis for O+, O-, A+, A-, B+, AB+, and AB-, respectively. Furthermore, in vitro digestion increased the erythroprotective effect in the bioavailable fraction in most blood groups, showing 37.12, 80.13, 5.48, 92.38, 67.93, 80.30, and 76.49% inhibition of hemolysis in O+, O-, A+, A-, B+, AB+, and AB-, respectively. These results demonstrate the biotechnological and biomedical potential of phycobiliproteins as safe candidates for the development of nutraceuticals and functional foods aimed at preventing oxidative damage.
Case Report
Medicine and Pharmacology
Psychiatry and Mental Health

Ngo Cheung

Abstract: Off-label use of glutamatergic agents is increasingly common in psychiatry, yet standardized protocols for outpatient dosing are lacking. This report describes the pharmacological management and dosing adjustments required for three patients receiving dextromethorphan (DXM) and piracetam for obsessive–compulsive disorder (OCD). Three adult women with severe OCD were treated in a routine clinical setting. Treatment history varied from treatment-naïve to treatment-resistant. All patients commenced treatment with a nighttime regimen of oral DXM and piracetam to minimize potential side effects while maintaining existing psychotropic regimens. One patient achieved full remission on a once-nightly regimen. The other two patients showed a "wearing-off" effect, in which symptoms improved rapidly after waking but returned in the late afternoon, necessitating a change from once-daily to twice-daily (b.i.d.) dosing. This change resolved the afternoon symptom breakthrough without requiring a dose increase. Experience with these cases suggests that while bedtime administration is a safe starting point for routine care, the half-life of the agents may necessitate split dosing for some individuals. The observation that simple schedule adjustments can resolve diurnal symptom fluctuation provides a practical insight for psychiatrists managing OCD with glutamatergic augmentation.
Article
Public Health and Healthcare
Other

Jingwen Cai

,

Caroline Dudish

,

Amani Mouna

,

Angelena Jacob

,

Wesley James

,

Douglas Dickinson

,

Hongfang Yu

,

Yutao Liu

,

Ashish K Sarker

,

Mustafa Culha

+3 authors

Abstract: Nutraceuticals such as curcumin, resveratrol, lycopene, lutein, and coenzyme Q10 possess strong antioxidant and anti-inflammatory activities, but their practical use is hindered by poor solubility and bioavailability. Traditional nanocarriers such as liposomes, nanoemulsions, and polymeric nanoparticles often rely on surfactants and synthetic organic solvents that limit safety, scalability, and regulatory acceptance. The present study evaluated the Facilitated Self-Assembling Technology (FAST) platform as a clean-label alternative for generating bioavailable nutraceutical nanoparticles. Using only a food-grade facilitating medium, FAST enabled spontaneous formation of stable, amorphous nanoparticles with a strong negative surface charge and high colloidal stability. Hybrid nanoparticles combining epigallocatechin-3-gallate palmitates (EC16), curcumin, and resveratrol further improved surface charge, reduced size range, and exhibited enhanced stability under simulated gastric conditions. All formulations demonstrated excellent biocompatibility in XTT assays, with no reduction in viability compared to control. Fluorescence imaging of EC16/Cy5 hybrid nanoparticles confirmed nanoparticle–cell surface interactions without cytotoxicity. Compared with chemical conjugation and lipid-based nanoencapsulation, FAST offered faster, surfactant-free, and energy-efficient production, fully compliant with FDA generally recognized as safe (GRAS) standards. These results support the FAST platform as an efficient, economical, and scalable nanotechnology for next-generation functional beverages and oral nutraceutical delivery systems that meet both regulatory and consumer demands for natural, sustainable innovation.
Article
Physical Sciences
Theoretical Physics

Satya Seshavatharam U.V

,

Lakshminarayana S

,

Gunavardhana Naidu T

Abstract: In the framework of the recently proposed 4G model of final unification, integrating three large atomic gravitational constants corresponding to the electromagnetic, strong, and electroweak interactions, we explore the physical existence of a fundamental electroweak fermion of rest energy 585 GeV. This particle is envisioned as the “zygote” of all elementary fermions and as the weak-field counterpart to photons and gluons. Using three core assumptions and five defining relations, the model quantitatively reproduces key nuclear and particle physics observables, including the strong coupling constant, nuclear binding energies, neutron lifetime, charge radii, and several dimensionless large numbers. Theoretical string tensions and energies are derived for each atomic interaction (weak, strong, electromagnetic) using experimentally relevant scales (GeV–MeV–eV) rather than the inaccessible Planck scale, thus extending string theory's applicability to testable low-energy domains. Comparative analysis (Tables 1 and 2) demonstrates close agreement between calculated string energies and known interaction energies, providing a bridge between quantum gravity concepts and measurable nuclear data. The model also predicts possible astrophysical signatures of the 585 GeV fermion through annihilation and acceleration processes capable of generating TeV–multi-TeV photons. A neutral fermion of 585 GeV seems to be in line with the recent Fermi-LAT gamma-ray excesses in the Milky Way halo. While the approach is qualitative in some mathematical details, its ability to fit fundamental constants and nuclear properties within a unified string–gravitational paradigm offers a promising, experimentally approachable route toward a physically grounded final unification theory. Additionally, our 4G model assumes a charged electroweak fermion with a mass of 585 GeV/c², intriguingly close to half the mass of the neutral supersymmetric Higgsino, estimated to lie between 1.1 and 1.2 TeV/c². This numerical proximity reinforces the model's alignment with leading theories of dark matter and supersymmetry, highlighting the charged fermion's potential role as a fundamental building block within the electroweak sector. Such correspondence provides a compelling avenue for experimental searches and deeper theoretical investigations bridging nuclear physics and particle phenomenology.
Article
Physical Sciences
Mathematical Physics

Hyoung-In Lee

,

Sang-Hyeon Kim

,

Tae-Yeon Kim

,

Hee-Jeong Moon

Abstract: The structural vibration of industrial droplet dispensers can be modeled, to a good approximation, by telegraph-like equations. We reinterpret the telegraph equation from the standpoint of an electric-circuit system consisting of an inductor and a resistor, which interacts with an environment, say, a substrate. This interaction takes place through a capacitor and a shunt resistor and serves as leakage. We have performed an analytical investigation of the frequency dispersion of telegraph equations over an unbounded one-dimensional domain. By varying newly identified key parameters, we have not only recovered the well-known characteristics but also discovered crossover phenomena in the phase and group velocities. We have examined the frequency responses of the electric circuit underlying telegraph equations, thereby confirming their role as low-pass filters. By identifying a set of physically meaningful reduced cases, we have laid foundations on which wave propagation over a finite domain with appropriate side conditions can be further explored.
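For reference, a standard form of the telegraph equation and the dispersion relation obtained from a plane-wave ansatz are sketched below; the mapping of the paper's circuit elements (shunt resistor, leakage capacitor) onto the damping parameters α, β and the wave speed c is not reproduced here and should be taken as an assumption.

```latex
% Standard telegraph equation with a plane-wave ansatz (reference sketch only).
\[
  \partial_t^2 u + (\alpha+\beta)\,\partial_t u + \alpha\beta\,u
  = c^2\,\partial_x^2 u,
  \qquad
  u(x,t) \propto e^{\,i(kx-\omega t)}
\]
\[
  \Rightarrow\quad
  c^2 k^2 = \omega^2 + i(\alpha+\beta)\,\omega - \alpha\beta,
  \qquad
  v_{\mathrm{ph}} = \frac{\omega}{\operatorname{Re}k},
  \qquad
  v_{\mathrm{g}} = \frac{d\omega}{d\,\operatorname{Re}k}.
\]
```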
Article
Physical Sciences
Optics and Photonics

Vladimir Saveljev

Abstract: The moiré effect is a physical phenomenon in periodic (or nearly periodic) structures. A straightforward approach does not enable us to fully understand this complex phenomenon and describe it in all its details. Therefore, modeling of the effect is often necessary. The combined simulation incorporates both physical and computer simulations. Computer tools for simulating the moiré effect in parallel layers and volumetric displays are presented, along with methods for replacing original microscopic objects with their macroscopic equivalents, thereby facilitating the development of a physical model. (It resembles an aerodynamic model of an aircraft or vehicle.) The combined simulation was made for 3D displays, cylindrical structures (single- or double-layered nanoparticles), and volumetric 3D structures. The results can be applied to nanoparticles, crystallography, and the improvement of the visual quality of 3D displays.
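A quick way to see why replacing microscopic gratings with macroscopic equivalents is plausible is the textbook moiré-period relation for two parallel line gratings (a general formula, not one taken from this paper): the beat period is much coarser than either grating.

```python
# Textbook moiré period of two parallel line gratings with periods p1 and p2.
p1, p2 = 1.00, 1.05                        # arbitrary units, hypothetical values
p_moire = p1 * p2 / abs(p1 - p2)
print(f"moire period ≈ {p_moire:.1f}")     # ≈ 21.0, far coarser than either grating
```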
