
Article
Computer Science and Mathematics
Algebra and Number Theory

Huan Xiao

Abstract: In this paper we first give a new formula for the Liouville function and then, by using the method for proving the Bateman–Horn conjecture, we give a parallel proof of the Chowla conjecture.
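For readers outside analytic number theory, the objects behind this claim can be stated compactly. The abstract does not reproduce the paper's new formula, so the block below recalls only the standard definition of the Liouville function and the statement of the Chowla conjecture it targets.

```latex
% Liouville function: completely multiplicative, recording the parity of
% \Omega(n), the number of prime factors of n counted with multiplicity.
\[
  \lambda(n) = (-1)^{\Omega(n)}, \qquad \lambda(mn) = \lambda(m)\,\lambda(n).
\]
% Chowla's conjecture: for fixed distinct integers 0 \le a_1 < \dots < a_k,
% shifted products of \lambda cancel on average.
\[
  \sum_{n \le x} \lambda(n + a_1)\,\lambda(n + a_2)\cdots\lambda(n + a_k) = o(x)
  \quad \text{as } x \to \infty .
\]
```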

Article
Public Health and Healthcare
Public Health and Health Services

Kiechan Namkung, Kanghyun Lee

Abstract: Background/Objectives: 40-Hz sensory stimulation is being explored as an everyday, non-pharmacological approach for cognitive-health applications, but sustained use depends on acceptability and implementable delivery/UX. We examined user-perceived acceptability and implementation considerations for a 40-Hz sine-wave–integrated soundscape intervention. Methods: Eleven adults aged ≥40 years in Seoul, Republic of Korea, were assigned to waves or forest soundscapes (between-participants) and completed a within-participant comparison of 40-Hz–OFF (soundscape-only) versus 40-Hz–ON (soundscape plus an additively layered 40-Hz sine wave). Each condition comprised seven cycles of 50 s playback and 10 s silence (~7 min) with a 10 min washout. Participants completed a session-end 7-point Likert appraisal of the 40-Hz–ON stimulus and a semi-structured interview. Interview transcripts were analyzed using thematic analysis and interpreted using the Theoretical Framework of Acceptability and Proctor et al.’s implementation outcomes as sensitizing frameworks. Results: Likert appraisals indicated mid-to-high comfort and immersion (medians = 5) and moderate calmness (median = 4), with relatively low unpleasantness (median = 2). Perceived artificiality varied widely (range 1–7) and overall preference was moderate (median = 4). Interviews showed heterogeneous detectability of 40-Hz inclusion; acceptability depended on whether the layered component blended naturally or was perceived as mechanical/rumbling. Participants highlighted context fit (e.g., bedtime versus morning routines), “backgroundability,” and low-friction automation (timers/scheduling) as key sustainability factors, while emphasizing acoustic safeguards such as gentle onset and conservative default levels. Conclusions: A 40-Hz sine-wave–integrated soundscape can be acceptable, but responses are heterogeneous and sensitive to timbral salience and usage context. Scalable delivery should incorporate space-oriented playback options, simplified automation, acoustic safeguards, and coherence-focused guidance with appropriate disclaimers.
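As a concrete picture of the stimulus construction described above, here is a minimal NumPy sketch that additively layers a 40-Hz sine onto a mono soundscape and assembles the seven 50 s playback / 10 s silence cycles. The sample rate, mixing gain, and the noise stand-in for the waves/forest recording are assumptions; the abstract specifies none of them.

```python
import numpy as np

SR = 44_100        # sample rate in Hz (assumed)
F_GAMMA = 40.0     # 40-Hz component (from the abstract)
PLAY_S, SILENCE_S, CYCLES = 50, 10, 7   # 7 x (50 s playback + 10 s silence)

def layer_40hz(soundscape: np.ndarray, gain: float = 0.1) -> np.ndarray:
    """Additively layer a 40-Hz sine onto a mono signal (gain is assumed)."""
    t = np.arange(len(soundscape)) / SR
    return soundscape + gain * np.sin(2 * np.pi * F_GAMMA * t)

# Low-level noise as a stand-in for the waves/forest soundscape recording.
rng = np.random.default_rng(0)
block = rng.normal(0.0, 0.05, PLAY_S * SR)

on_block = layer_40hz(block)                 # 40-Hz-ON condition
cycle = np.concatenate([on_block, np.zeros(SILENCE_S * SR)])
session = np.tile(cycle, CYCLES)
print(f"session length: {len(session) / SR / 60:.1f} min")   # ~7.0 min
```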

Article
Social Sciences
Government

Igor Calzada, Itziar Eizaguirre

Abstract: This article advances EcoTechnoPolitics as a transformational conceptual and policy recommendation framework for hybridizing digital–green twin transitions under conditions of planetary polycrises. It responds to growing concerns that dominant policy approaches by supranational institutions—including the EU, UN, OECD, World Bank Group, WEF, and G20—remain institutionally siloed, technologically reductionist, and insufficiently attentive to ecological constraints. Moving beyond the prevailing digital–green twin transitions paradigm, the article coins EcoTechnoPolitics around three hypotheses: the need for planetary thinking grounded in (i) anticipatory governance, (ii) hybridization, and (iii) a transformational agenda beyond cosmetic digital–green alignment. The research question asks how EcoTechnoPolitics can enable planetary thinking beyond digital–green twin transitions under ecological and technological constraints. Methodologically, the study triangulates (i) an interdisciplinary literature review with (ii) a place-based analysis of two socially cohesive city-regions—the Basque Country and Portland (Oregon)—and (iii) a macro-level policy analysis of supranational digital and green governance frameworks. The results show that, despite planetary rhetoric around sustainability and digitalization, prevailing policy architectures largely externalize ecological costs and consolidate technological power. Building on this analysis, the discussion formulates transformational policy recommendations. The conclusion argues that governing planetary-scale ecotechnopolitical systems requires embedding ecological responsibility within technological governance.

Article
Medicine and Pharmacology
Psychiatry and Mental Health

Ngo Cheung

Abstract: For years, PTSD has been viewed mainly as something people "catch" after a bad experience, the result of an adult-onset failure in the brain's fear circuit. The standard story focuses on shaky extinction learning, a mis-tuned HPA axis, and glitches in glutamate-driven plasticity. Yet the newest, large-scale genetic studies paint a different picture. They show that the strongest polygenic signals sit not in classic glutamatergic genes but in the neuro-immune machinery that prunes synapses while the brain is still wiring itself. These signals hold up even after we account for every bit of shared variation with the glutamate system. Using MAGMA gene-set tests, partitioned heritability, and transcriptome-wide association built around the latest multi-ancestry GWAS of PTSD, we found repeatable, Bonferroni-safe enrichments in pruning regulators such as complement C4A, MHC-I gene HLA-B, the guidance cues SEMA3F and EFNA5, and the schizophrenia-linked transcription factors TCF4 and ERBB4. Together, the data point to what we call a "pruning-vulnerability cascade." First, genetically driven mis-pruning during sensitive windows leaves key circuits immature and easily rattled. Later trauma then overwhelms this shaky scaffold, setting off secondary failures in glutamate signalling and HPA feedback and locking them in through lasting epigenetic marks. Seeing PTSD as a problem that starts in development pulls together its early-life risk factors, the small hippocampi and other structural oddities on imaging, and its genetic overlap with disorders like schizophrenia. It also spotlights fresh prevention angles: for instance, dampening complement activity in young people who carry high pruning-risk profiles. To move from idea to intervention, we will need broad, multi-omic work in more diverse cohorts—but the roadmap is now clearer than ever.

Review
Medicine and Pharmacology
Pediatrics, Perinatology and Child Health

Carlos Domínguez-Vargas, Jesús Eduardo García-Hernández, Emiliano Peña-Durán, Samantha Jonnue Ramírez-Flores, Ulises Moisés González-Reyes, Ramsés Emiliano Martínez-Hernández, Daniela Alejandra Torres-Rodríguez, Paloma Marylí Prado-López

Abstract: Early-life exposure to toxic metals remains a major global public health concern, particularly for children, whose developing neuroendocrine and metabolic systems are highly vulnerable. Within the exposome and Developmental Origins of Health and Disease (DOHaD) frameworks, this narrative review synthesizes human evidence on pediatric exposure to lead (Pb), mercury (Hg), cadmium (Cd), and arsenic (As) and its associations with neurodevelopmental, metabolic, and endocrine outcomes. We primarily examined epidemiological studies, systematic reviews, and meta-analyses published between 2010 and 2025 that relied on biomonitoring-based exposure assessment and appropriate adjustment for socioeconomic status and passive smoking, while seminal earlier studies were considered to contextualize biological mechanisms and conceptual frameworks. The evidence for neurodevelopmental toxicity is the most consistent, with prenatal and early childhood exposure to Pb and Hg robustly associated with adverse cognitive, behavioral, and motor outcomes and no identified safe exposure threshold for lead. In contrast, associations with obesity and pubertal timing are more heterogeneous and metal-specific, reflecting nonlinear dose–response relationships, sex-specific susceptibility, and critical exposure windows. Emerging data indicate that metals may act as metabolism- and endocrine-disrupting chemicals, with effects amplified by mixture exposures and adverse social conditions, and partially modified by nutritional status. Overall, the findings support life-course–oriented, biomonitoring-based research and prevention strategies that address cumulative exposures and developmental vulnerability to reduce long-term disease risk.

Article
Biology and Life Sciences
Biology and Biotechnology

Akihiro Ishioka, Prihardi Kahar, Tasuku Nagano, Noor-Afiqah Ahmad Zain, Yutaro Mori, Chiaki Ogino

Abstract: Oleaginous yeasts are promising microbial platforms for lipid production from non-conventional carbon sources; however, acetate utilization is frequently constrained by physiological limitations associated with culture pH. In this study, acetate utilization, biomass formation, and lipid production by Lipomyces starkeyi were investigated under flask and fed-batch cultivation to evaluate the influence of culture pH and pH control strategy. Statistically supported flask-scale experiments demonstrated that acetate concentration and cultivation time significantly affected acetate consumption, biomass formation, lipid yield, and culture pH, with excessive acetate loading resulting in culture alkalization, incomplete substrate utilization, and reduced process performance. Although lipid yield increased with increasing acetate concentration, lipid content and fatty acid composition remained unchanged, indicating that enhanced lipid production was primarily attributable to increased biomass formation rather than to changes in lipid biosynthesis. Fed-batch cultivation under different pH-control strategies provided qualitative insights into the relationships among pH regulation, acetate availability, and lipid accumulation under controlled fermentation conditions. While lipid accumulation was observed under both HCl-based and acetic acid–based pH control, differences in pH stability and cumulative acetate availability were associated with distinct patterns of lipid production. Collectively, these results identify culture pH as a critical physiological parameter influencing acetate utilization and lipid accumulation in L. starkeyi and suggest that coordinated pH control and carbon feeding strategies may improve the robustness of acetate-based lipid production processes. Further replicated fed-batch studies will be required to quantitatively validate these trends and support industrial applications.

Review
Biology and Life Sciences
Neuroscience and Neurology

Abdullah Ayad

Abstract: Spinal cord regeneration requires a transformative strategy capable of rewriting inhibitory genetic programs while orchestrating real-time electrical communication with regenerating neural tissues. Recent advancements in precision CRISPR genome editing effectively silence or activate crucial molecular gatekeepers such as PTEN, SOCS3, and various epigenetic repressors, thereby reactivating dormant intrinsic regenerative pathways and enabling robust axonal growth. Concurrently, cutting-edge bioelectronic technologies utilizing piezoelectric, triboelectric, and magnetoelectric scaffolds have emerged, adeptly harnessing the body's inherent biomechanical energy. These innovative materials convert subtle physiological micromotions into finely tuned electrical stimuli, precisely guiding neuronal regeneration without external power sources, addressing limitations associated with traditional implants such as infection risks and mechanical incompatibility. Integrating these genetic modifications with bioelectric innovations creates a potent synergy. Genome-level reprogramming amplifies neuronal responsiveness to bioelectrical signals, markedly enhancing axonal regeneration. Simultaneously, autonomous electrical stimulation sustains and stabilizes cellular, metabolic, and synaptic improvements induced by genomic interventions, forming a closed-loop, self-sustaining therapeutic platform. This advanced system significantly transcends conventional transient recovery approaches, moving toward durable, personalized outcomes. Such convergence of advanced genetic engineering and intelligent biomaterial design represents a groundbreaking shift in regenerative neurology. Despite promising preclinical outcomes, significant translational challenges remain. Critical hurdles include ensuring precise delivery of CRISPR tools, mitigating off-target genomic effects, enhancing biocompatibility and scaffold stability, and navigating rigorous regulatory pathways. Addressing these challenges necessitates integrating next-generation gene-editing technologies, comprehensive genomic surveillance, advanced biomaterial sciences, and meticulous preclinical evaluations. Future directions in spinal cord injury research encompass multiplex genome editing, AI-driven scaffold optimization via digital twins, and tailored immune-evasive biomaterials. Collectively, this innovative approach has the potential to redefine regenerative medicine's boundaries, offering unprecedented hope for sustained, personalized recovery and dramatically improving quality of life for individuals affected by spinal cord injuries.

Article
Public Health and Healthcare
Public Health and Health Services

Oratilwe Penwell Mokoena, Solly Matshonisa Seeletse

Abstract: Background and Objectives: Formal education in Africa is becoming increasingly influenced by traditional media consumption, ranging from television and radio to internet usage. This study aims to determine the effects of traditional media consumption on health literacy in provinces with high illiteracy. Materials and Methods: The study adopted a retrospective cross-sectional design using the 2016 South African Demographic Survey data to analyse the factors affecting health literacy. Participants were selected using a stratified two-stage sampling method to ensure national and provincial representativeness. A total of 1,982 participants aged 15–29 years who met the inclusion criteria were included for further analysis. Pearson’s chi-square test was used to test for association between health literacy and media consumption. Multivariate logistic regression was used to determine the effects of traditional media consumption on health literacy; p ≤ 0.05 was considered statistically significant. STATA version 16.1 (StataCorp LLC, College Station, TX, USA) was used for analysis. Results: Media consumption emerged as a strong predictor: individuals who reported watching television had increased odds of health literacy (OR = 2.67; 95% CI: 1.55–4.61; p < 0.001). Similarly, internet use was a positive predictor of health literacy (OR = 3.11; 95% CI: 1.76–5.52; p < 0.001). Educational level also emerged as a significant predictor: individuals with secondary school education had significantly higher odds of health literacy compared to those with lower educational levels (OR = 17.10; 95% CI: 4.20–69.63; p < 0.001). Conclusion: This study highlights the critical role media consumption plays in shaping health literacy outcomes among the youth, particularly in provinces with high illiteracy rates. By using media platforms strategically and ensuring equitable access, educators, health practitioners and policymakers can unlock new pathways to health literacy, fostering a more informed, empowered, and connected society.
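The modeling step is standard logistic regression; as a hedged illustration of how such odds ratios and confidence intervals are produced, here is a minimal Python sketch on simulated data. The study itself used STATA 16.1, and the variable names, simulated sample, and coefficients below are invented for the example.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 1982                              # sample size mirrors the abstract
tv = rng.integers(0, 2, n)            # watches television (0/1)
internet = rng.integers(0, 2, n)      # uses the internet (0/1)
secondary = rng.integers(0, 2, n)     # secondary education (0/1)

# Simulate health literacy with positive log-odds for each predictor.
logit = -1.0 + 0.98 * tv + 1.13 * internet + 1.5 * secondary
literate = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

X = sm.add_constant(np.column_stack([tv, internet, secondary]))
res = sm.Logit(literate, X).fit(disp=0)

# Odds ratios with 95% CIs, the quantities reported in the abstract.
for name, or_, (lo, hi) in zip(["const", "tv", "internet", "secondary"],
                               np.exp(res.params), np.exp(res.conf_int())):
    print(f"{name:10s} OR={or_:5.2f}  95% CI [{lo:.2f}, {hi:.2f}]")
```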

Article
Computer Science and Mathematics
Computer Science

Tanzina Sultana, Asura Akter Sunna, Mohammed Majbah Uddin, Naresh Kshetri

Abstract: As artificial intelligence (AI) technologies, particularly generative and collaborative learning models, are increasingly integrated into healthcare and other sensitive domains, data privacy, security, and fairness concerns have grown significantly. This paper provides a thorough examination of current privacy-preserving AI models, including federated learning (FL), differential privacy (DP), homomorphic encryption, and generative adversarial networks (GANs). Key contributions are reviewed across recent works that explore privacy-preserving mechanisms within domains such as clinical diagnostics, drug discovery, the Internet of Medical Things (IoMT), and virtual health systems. Dynamic federated models (e.g., DynamicFL) that adjust model architecture based on computational heterogeneity, as well as encryption-augmented FL architectures, are presented as ways to maintain data locality while ensuring equitable performance. GAN-based synthetic data generators (e.g., medGAN, CorGAN) offer alternative solutions for sharing healthcare data without compromising patient identity, though they introduce new threats if misused. Across these models, a multi-phase life cycle of threats is identified—spanning data collection, model training, inference, and system integration—highlighting the importance of proactive governance. Compliance frameworks such as the EU AI Act and the U.S. AI Bill of Rights are emerging as reference points for standardizing technological implementation in healthcare data management. This work reviews existing AI models, seeks to identify which of them best ensure data privacy and shareability with ethical responsibility, and proposes a layered privacy-preservation paradigm essential for safely deploying AI in sensitive environments.
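Of the mechanisms surveyed, differential privacy is the most compact to illustrate. The sketch below implements the textbook Laplace mechanism for an epsilon-DP counting query; it is a generic construction, not code from any reviewed system, and the cohort count is hypothetical.

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float,
                      epsilon: float, rng: np.random.Generator) -> float:
    """Release true_value with epsilon-DP by adding Laplace noise
    of scale sensitivity / epsilon."""
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

rng = np.random.default_rng(0)
# Counting query over a hypothetical patient cohort: adding or removing one
# record changes the count by at most 1, so the L1 sensitivity is 1.
true_count = 131
for eps in (0.1, 1.0, 10.0):
    noisy = laplace_mechanism(true_count, 1.0, eps, rng)
    print(f"epsilon={eps:5.1f}  noisy count = {noisy:8.2f}")
```

Smaller epsilon means stronger privacy and noisier answers, the same trade-off the reviewed FL and GAN pipelines must manage at scale.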

Article
Engineering
Other

Akshay Bambore, Patrick Hendrick, Jean Philippe Ponthot

Abstract: The Wallonia region of Belgium aims to transition to a modern hydrogen infrastructure. Since hydrogen is much lighter than natural gas, it is important to understand its nature and behavior while it is transported through pipelines. This research aims to observe the pressure loss in pipelines due to surface roughness with H2, together with other singular losses, in order to find a solution that minimizes the pressure loss occurring during transportation. The study involves numerical methods and gas equation models to determine the pressure loss. The analysis includes the properties of hydrogen gas, the pipeline material used, the friction factor, pipeline efficiency, and other relevant properties of hydrogen and the pipelines. To address this challenge, the study integrates numerical fluid dynamics methods with structural modelling of pipeline walls. It accounts for long-term friction effects, erosion over several years, radial pressure gradients (mixing pressure drop), acceleration effects, and gravity influences, considering the non-ideal behavior of gaseous hydrogen (GH2).
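As a rough sketch of the friction-loss core of such an analysis, the following computes a Darcy–Weisbach pressure drop for gaseous hydrogen with the Haaland approximation to the Colebrook friction factor. It omits the paper's non-ideal-gas, erosion, and radial-gradient corrections, and every numeric input (pressure, temperature, diameter, roughness, velocity, viscosity) is an assumed example value.

```python
import math

M_H2 = 2.016e-3   # molar mass of H2, kg/mol
R = 8.314         # gas constant, J/(mol K)
MU_H2 = 8.9e-6    # dynamic viscosity of H2 near 300 K, Pa s (approximate)

def haaland_friction(re: float, rel_rough: float) -> float:
    """Haaland explicit approximation to the Colebrook equation."""
    return (-1.8 * math.log10((rel_rough / 3.7) ** 1.11 + 6.9 / re)) ** -2

def pressure_drop(p_pa, t_k, d_m, l_m, v_ms, rough_m):
    rho = p_pa * M_H2 / (R * t_k)        # ideal-gas density (simplification)
    re = rho * v_ms * d_m / MU_H2        # Reynolds number
    f = haaland_friction(re, rough_m / d_m)
    return f * (l_m / d_m) * rho * v_ms ** 2 / 2   # Darcy-Weisbach, Pa

# 10 km of DN500 pipe at 50 bar and 15 C, 15 m/s flow, 0.05 mm roughness.
dp = pressure_drop(50e5, 288.0, 0.5, 10_000.0, 15.0, 5e-5)
print(f"estimated friction pressure drop: {dp / 1e5:.2f} bar")
```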

Article
Engineering
Other

Julian Zenner, Bryan Rainwater, Daniel Zimmerle

Abstract: Methane emissions from end-use installations in residential natural gas systems remain poorly quantified, despite their importance to both safety and climate policies worldwide. While distribution networks and appliances have received research attention, interior piping between the meter and appliances represents a critical knowledge gap. To address this gap, a systematic survey of 473 residential systems in Saarlouis, Germany was conducted using standardized pressure-decay tests (DVGW G 600). Measurements were performed during the installation of gas regulators necessitated by a grid pressure increase from 23 mbar to 55 mbar above ambient. This provided a unique opportunity to assess whole-system leakage under controlled conditions without installation modifications. Leak rates were standardized to reference pressure and converted to methane emissions using measured gas composition. A total of 411 (86.9%) installations showed no detectable leak rate (LDL: 0.2 l h⁻¹). However, seven systems (1.5%) exceeded 1 l h⁻¹, and one surpassed the unacceptable threshold of 5 l h⁻¹. Mean emissions across all systems were 0.067 [0.041, 0.098] g h⁻¹, with smaller installations showing higher volume-normalized rates. Critically, fewer than 1.48% of systems contributed more than 46% of total emissions, demonstrating a strongly skewed, heavy-tailed distribution. When scaled nationally using Monte Carlo methods that account for sampling uncertainty and the skewed distribution, residential interior piping contributes 12.30 [8.11, 18.55] Gg yr⁻¹ to Germany's methane emissions. These results emphasize the need to include residential leak rates in emission inventories and highlight the efficiency potential of targeted mitigation strategies focused on high-emitting installations under evolving EU methane regulations.
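The national scale-up step lends itself to a compact illustration: bootstrap the heavy-tailed sample of per-system leak rates and extrapolate the mean to the building stock. The sketch below does this on synthetic lognormal data; the dwelling count, distribution parameters, and zero-emitter fraction are stand-ins, not the study's inputs.

```python
import numpy as np

rng = np.random.default_rng(7)
n_measured = 473          # number of surveyed systems, as in the abstract

# Mostly non-detects with a heavy upper tail, mimicking the reported skew.
sample_g_per_h = np.where(rng.random(n_measured) < 0.869, 0.0,
                          rng.lognormal(mean=-1.0, sigma=1.5, size=n_measured))

N_NATIONAL = 20_000_000   # hypothetical count of gas-supplied dwellings
HOURS_PER_YEAR = 8760.0

draws = []
for _ in range(10_000):   # bootstrap over sampling uncertainty
    boot = rng.choice(sample_g_per_h, size=n_measured, replace=True)
    draws.append(boot.mean() * N_NATIONAL * HOURS_PER_YEAR / 1e9)  # g -> Gg

lo, med, hi = np.percentile(draws, [2.5, 50, 97.5])
print(f"national emissions: {med:.2f} [{lo:.2f}, {hi:.2f}] Gg/yr")
```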

Article
Public Health and Healthcare
Physical Therapy, Sports Therapy and Rehabilitation

Francesco Alessi Longa

Abstract: Research shows that dual-task balance performance deficits serve as indicators that help predict future falls among elderly people. This study investigated how fallers (people who experienced two or more falls per year) performed compared to non-fallers during single- and dual-task balance assessments. The study involved 24 community-dwelling participants between 65 and 80 years old who completed Balance Error Scoring System (BESS) and Timed Up-and-Go (TUG) tests while performing serial-7 subtraction. The results showed that fallers made more BESS errors (M=18.4±4.2 vs. 11.2±3.1) and their dual-task TUG times were longer (M=14.8±2.1 s vs. 11.2±1.5 s) than non-fallers. The dual-task performance of fallers showed a significant decline of 25.4% compared to non-fallers, who experienced a 12.1% decline (F(1,22)=8.92, p=.007, η²=.29). These findings indicate that fallers experience more significant cognitive-motor interference, which supports dual-process models, and they underline the need for motor-cognitive screening and training programs to prevent falls.
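The headline percentages follow from the usual dual-task cost formula, the relative slowdown from single- to dual-task performance. In the sketch below the single-task TUG times are back-computed from the reported dual-task times and declines, so they are illustrative, not the study's raw data.

```python
# Dual-task cost (DTC): percentage slowdown from single- to dual-task.
def dual_task_cost(single_s: float, dual_s: float) -> float:
    return (dual_s - single_s) / single_s * 100.0

# Back-computed single-task times (illustrative, rounded):
fallers_single, fallers_dual = 11.8, 14.8            # 14.8 / 1.254 ~ 11.8 s
non_fallers_single, non_fallers_dual = 10.0, 11.2    # 11.2 / 1.121 ~ 10.0 s

print(f"fallers DTC:     {dual_task_cost(fallers_single, fallers_dual):.1f}%")
print(f"non-fallers DTC: {dual_task_cost(non_fallers_single, non_fallers_dual):.1f}%")
```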

Concept Paper
Medicine and Pharmacology
Neuroscience and Neurology

Baikuntha Panigrahi

Abstract: Decision-making around disease-modifying therapy (DMT) in chronic neurological disorders is often framed as a linear optimization problem driven by efficacy and safety data. In practice, especially in resource-limited settings, therapy choice represents a strategic interaction among multiple agents, including patients, clinicians, health systems, and disease dynamics operating under uncertainty, constraints, and competing objectives. In this work, we adopt a game-theoretic perspective to formalize DMT selection as a structured decision space rather than a single optimal choice. Using simplified strategic models, trade-offs between efficacy, toxicity, affordability, adherence, and long-term disease control can be represented as equilibria emerging from interacting incentives rather than fixed hierarchies of outcomes. The framework highlights how rational local decisions may lead to globally suboptimal trajectories, and how dominance, cooperation, and delayed commitment can shape therapeutic pathways over time. This approach provides a conceptual tool for exploring strategy spaces and identifying conditions under which particular choices become stable or fragile. By making implicit clinical reasoning explicit, the framework offers a basis for hypothesis generation, policy discussion, and future quantitative modeling of treatment strategies in constrained healthcare environments.
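To make the equilibrium language concrete, here is a toy two-player game in the spirit of the framework: a clinician choosing between a high-efficacy/high-cost DMT and a modest one, and a health system choosing between full reimbursement and a capped budget. All payoffs are hypothetical and exist only to show how a pure-strategy Nash equilibrium is read off a payoff matrix.

```python
import numpy as np

# Rows: clinician (0 = high-efficacy/high-cost DMT, 1 = modest DMT).
# Cols: health system (0 = full reimbursement, 1 = capped budget).
clinician = np.array([[3, 0],
                      [2, 2]])
system = np.array([[1, 2],
                   [3, 3]])

def pure_nash(a: np.ndarray, b: np.ndarray):
    """Cells where each player's strategy is a best response to the other's."""
    return [(i, j)
            for i in range(a.shape[0]) for j in range(a.shape[1])
            if a[i, j] == a[:, j].max() and b[i, j] == b[i, :].max()]

print("pure-strategy Nash equilibria:", pure_nash(clinician, system))  # [(1, 1)]
```

In this toy game the unique equilibrium is (modest DMT, capped budget) even though the clinician prefers the high-efficacy cell, a small instance of the abstract's point that rational local decisions can settle on globally suboptimal trajectories.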

Article
Medicine and Pharmacology
Pediatrics, Perinatology and Child Health

Safaa Alsayigh, Nuha Nimeri, Alaa Almashhadani, Amna Abdelgadir, Omar Haidar, Muhammed Talha, Ashraf Gad

Abstract: Background: Neonatal hyperglycemia (NH) is a common metabolic complication among extremely preterm (EP) infants. However, early risk factors and clinical outcomes of NH remain unclear. Objective: To evaluate the association of NH with clinical outcomes and neurodevelopmental (NDD) risk in EP infants. Methods: This retrospective propensity score matching (PSM) study included EP infants born between 2018 and 2019 at the Women’s Wellness and Research Center who met the NH criteria (blood glucose >8.3 mmol/L). Hyperglycemia severity, maternal factors, delivery room interventions, early physiological markers, neonatal morbidities, and follow-up outcomes were compared. Propensity score matching (1:1) was used to adjust for significant baseline demographics and clinical characteristics. Results: Out of 225 EP infants, 131 (58.2%) developed NH in the first week of life, with mild hyperglycemia in 31.0%, moderate in 14.6%, and severe in 11.1% of cases. Before matching, infants with NH were more preterm and had lower birth weight and head circumference. Their mothers had lower rates of preterm premature rupture of membranes (PPROM). Affected infants required more surfactant in the delivery room and had higher oxygen and mechanical ventilation needs during the first week. After matching, NH was associated with significantly higher rates of ventilator-associated pneumonia (VAP) (23.6% vs 3.7%, OR 8.04, CI 1.72–37.66, p=0.003), longer duration of mechanical ventilation (19.8±25.3 vs 8.9±24.8 days, MD −10.94, CI −21.47 to −0.42, p=0.042), higher postnatal steroid use (18.2% vs 5.5%, OR 4.64, CI 1.56–14.37, p=0.040), significantly higher rates of severe retinopathy of prematurity (ROP) (21.6% vs 6.4%, OR 4.03, CI 1.04–15.50, p=0.032), and a trend towards moderate to severe bronchopulmonary dysplasia (BPD) (33.3% vs 15.9%, OR 2.64, CI 0.96–7.23, p=0.054). No significant differences in mortality were observed between the groups; however, infants with NH who died were older. Conclusion: Early NH in EP infants is associated with an increased risk of ventilator-associated pneumonia, prolonged mechanical ventilation, severe ROP, and moderate to severe BPD. These findings suggest that NH may contribute to poorer short-term outcomes in this vulnerable population.
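The matching step is a standard 1:1 nearest-neighbor propensity score match; here is a hedged Python sketch on simulated covariates. The study's actual covariate set, software, and caliper choices are not specified in the abstract, so everything below is illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 225                                  # cohort size, as in the abstract
X = rng.normal(size=(n, 3))              # e.g., gestational age, weight, HC
hyper = rng.random(n) < 1 / (1 + np.exp(-(X @ np.array([-0.8, -0.5, 0.2]))))

# Propensity score: probability of NH given baseline covariates.
ps = LogisticRegression().fit(X, hyper).predict_proba(X)[:, 1]

available = set(np.flatnonzero(~hyper))
pairs = []
for i in np.flatnonzero(hyper):          # greedy 1:1 match, no replacement
    if not available:
        break
    j = min(available, key=lambda k: abs(ps[k] - ps[i]))
    pairs.append((i, j))
    available.remove(j)

print(f"matched {len(pairs)} NH/control pairs")
```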

Article
Public Health and Healthcare
Public Health and Health Services

Pau Alfonso-Comos, Álvaro Briz-Redón, José Luis Dapena Díaz, Susana Rives, José María Fernández Navarro, Jaime Verdú-Amorós, Adela Cañete

Abstract: Background: Childhood cancer is the leading cause of natural death among children in high-income countries, despite treatment improvements. The Spanish Registry of Childhood Tumours (RETI-SEHOP) systematically records all cases treated within the network of SEHOP units. Using RETI-SEHOP data, we evaluated survival trends to assess progress in patient care, both overall and by tumour type. Methods: 20,534 childhood cancer cases (0-14 years) were recorded across the period 1999-2021. 1-, 3- and 5-year overall survival (OS) was estimated using the Kaplan-Meier method, applying the cohort approach for 1999-2018 and the period approach for 2019-2022. OS by age and sex was analysed in the recent 2009-2018 incidence cohort. Age-adjusted OS time trends were examined using joinpoint Cox models for 1999-2022. Results: For all tumours combined, 5-year OS increased from 75.4% to 84.6% between 1999-2003 and 2019-2022. While positive trends were identified for all haematological malignancies examined, the picture for solid tumours was more varied: ependymomas improved fastest (1.51 points annually), while sarcomas, except rhabdomyosarcoma, remained stagnant. Conclusion: Our results reflect a period characterised by a combination of new therapeutic developments, improved diagnostics and more refined risk stratification, which has ultimately led to a reduction in disease-related mortality.
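The survival estimation step is standard Kaplan–Meier; the sketch below reproduces it on a simulated cohort using the lifelines package, which is our tool choice for illustration, not necessarily the registry's. Follow-up times and event rates are invented.

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(3)
n = 500
# Simulated follow-up in months, administratively censored at 10 years.
followup = rng.exponential(scale=300.0, size=n).clip(max=120.0)
died = (followup < 120.0) & (rng.random(n) < 0.6)   # event indicator

kmf = KaplanMeierFitter()
kmf.fit(durations=followup, event_observed=died)

# 1-, 3-, and 5-year overall survival, the abstract's endpoints.
for months in (12, 36, 60):
    s = kmf.survival_function_at_times(months).iloc[0]
    print(f"OS at {months // 12} year(s): {s:.1%}")
```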

Article
Physical Sciences
Quantum Science and Technology

Arturo Tozzi

Abstract: Quantum entanglement is commonly characterized through global state descriptions on tensor product spaces, correlation measures or algebraic constructions, while local consistency constraints play no explicit structural role. We formulate entanglement as a combinatorial structure of overlapping local descriptions, drawing on De Bruijn graphs, where nodes represent overlapping contexts and paths encode globally coherent assemblies. We construct a graph whose nodes represent reduced quantum states on subsystems of fixed size and whose edges encode admissible extensions consistent with quantum mechanical compatibility conditions. Global many-body states correspond to paths on this graph, while entanglement is reinterpreted as a property of graph connectivity and path multiplicity, rather than as a standalone numerical quantity. This formalism allows a separation between constraints imposed purely by local quantum consistency and additional structure introduced by dynamics, symmetries or boundary conditions, also clarifying how large-scale structural features may arise from local compatibility alone. Our graph-based formulation provides several advantages over conventional approaches. Supporting a unified treatment of static entanglement structure and dynamical evolution, it incorporates finite-order locality and memory effects. Entanglement growth can be interpreted as path proliferation, while decoherence and noise correspond to the removal of admissible transitions. Our approach leads to testable hypotheses concerning the scaling of admissible state extensions, the robustness of entangled structures under local perturbations and the emergence of effective geometry from overlap constraints. Potential future directions include applications to many-body reconstruction problems and comparative analysis of different classes of quantum states within a single combinatorial language.
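The combinatorial skeleton of the construction is easy to exhibit in the classical setting: nodes are length-k contexts over local labels, edges join contexts agreeing on their (k−1)-overlap, and walks enumerate globally consistent assemblies. The sketch below is exactly this De Bruijn toy; real reduced quantum states and their compatibility conditions are far richer, so it shows only the path-counting idea.

```python
from itertools import product

ALPHABET = ("0", "1")   # stand-in for local state labels
K = 3                   # context (subsystem) size

nodes = ["".join(p) for p in product(ALPHABET, repeat=K)]
edges = {u: [u[1:] + a for a in ALPHABET] for u in nodes}  # suffix = prefix

def count_paths(length: int) -> int:
    """Walks on the overlap graph = globally consistent assemblies
    of the given total length."""
    counts = {u: 1 for u in nodes}
    for _ in range(length - K):
        counts = {u: sum(counts[v] for v in edges[u]) for u in nodes}
    return sum(counts.values())

# Every binary string of length 10 arises from exactly one walk:
print(count_paths(10))   # -> 1024 == 2**10
```

Removing admissible transitions (deleting edges) then models decoherence or noise, and path multiplicity plays the role the abstract assigns to entanglement growth.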

Article
Physical Sciences
Theoretical Physics

Markolf H. Niemz

Abstract: Today’s standard model of cosmology is based on general relativity (GR). GR and special relativity (SR) work for all observers, but no spacetime diagram works for all observers. This is because SR/GR lack absolute space and absolute time. We present a model that is based on Euclidean relativity (ER). ER describes a mathematical Master Reality, which is absolute 4D Euclidean space (ES). All objects move through ES at the dimensionless speed C. There is no time in ES. All motion in ES is due to an external “evolution parameter” θ. Every object experiences two orthogonal projections from ES as space and time: The axis of its current 4D motion is its proper time τ. Three orthogonal axes form its 3D space x1, x2, x3. Observing objects is identical to projecting them from ES onto an observer’s physical reality, which is a Minkowskian reassembly of the axes x1, x2, x3, τ. In this “τ-based Minkowskian spacetime” (τ-MS), θ converts to absolute parameter time ϑ. ER predicts the same relativistic effects as SR/GR, but gravity is Newtonian. Action at a distance is not an issue: In timeless ES, information is instantaneous. Only in physical realities does the time coordinate cause a delay in information. Presumably, gravity is carried by gravitons and manifests itself in τ-MS as gravitational waves. ER does not require curved spacetime, cosmic inflation, expanding space, dark energy, or non-locality. And yet, ER predicts the arrow of time, galactic motion, the Hubble tension, entanglement, and more. We propose using ER in cosmology and quantum mechanics. Is ER the key to unifying physics?

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Abuelgasim Mohamed Ibrahim Adam

Abstract: Reliable agentic AI requires not only accurate reasoning and adaptive control, but also mechanisms that preserve reliability over time. While recent work has introduced system-level evaluation frameworks (e.g., HB-Eval) and real-time control architectures (e.g., Adapt-Plan), the question of how reliability is retained across an agent’s operational lifespan remains largely unaddressed. Existing memory mechanisms typically store experiences based on recency or salience, inadvertently allowing low-quality behaviors to accumulate and degrade long-term performance. This paper introduces Evaluation-Driven Memory (EDM), a persistence governance layer that regulates long-term memory through certified evaluation metrics. EDM enforces selective consolidation, persisting only those trajectories that satisfy predefined reliability thresholds (e.g., Planning Efficiency Index, Trust Index), thereby preventing reliability regression. Conceptually, EDM reframes memory from a passive data store into an active governance mechanism situated between episodic execution and long-term knowledge accumulation. Empirical results demonstrate that EDM retains 50% fewer experiences while achieving 2× higher memory precision, reduces reasoning burden by 25% (CER = 0.75), and maintains long-term stability (MRS = 0.08) across repeated operational cycles. In contrast, flat memory architectures exhibit reliability degradation and increased cognitive load. We further position EDM within a coherent three-layer architecture—Evaluation (HB-Eval), Control (Adapt-Plan), and Persistence (EDM)—forming a closed trust loop for reliable agentic AI. These findings establish persistence governance as a necessary architectural principle for cumulative reliability, with implications for safety-critical systems, multi-agent collaboration, and human-AI interaction.
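A minimal sketch of the consolidation gate the abstract describes: only trajectories whose certified metrics clear fixed thresholds are persisted to long-term memory. The metric names come from the abstract; the threshold values and data model are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Trajectory:
    task: str
    planning_efficiency_index: float   # PEI, a certified evaluation metric
    trust_index: float                 # Trust Index, likewise certified

PEI_MIN, TRUST_MIN = 0.7, 0.8          # hypothetical reliability thresholds

def consolidate(episodic: list[Trajectory]) -> list[Trajectory]:
    """Persist only trajectories meeting all thresholds, so low-quality
    behaviors never accumulate in long-term memory."""
    return [t for t in episodic
            if t.planning_efficiency_index >= PEI_MIN
            and t.trust_index >= TRUST_MIN]

episodic = [
    Trajectory("route-plan", 0.92, 0.88),   # kept
    Trajectory("route-plan", 0.55, 0.90),   # dropped: inefficient plan
    Trajectory("diagnosis", 0.81, 0.42),    # dropped: low trust
]
print(f"retained {len(consolidate(episodic))}/{len(episodic)} trajectories")
```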

Review
Social Sciences
Urban Studies and Planning

Hannan V. Zubizarreta, Delfor Tito Aquino

Abstract: Purpose: This study aims to systematically analyze how Environmental, Social, and Governance (ESG) frameworks have been integrated into the design, operation, and valuation of office buildings. In particular, it explores the interplay between green certification systems, employee well-being, governance practices, digital ESG monitoring, and the financial performance of ESG-aligned office investments. Design/methodology/approach: Using the PRISMA 2020 methodology, a systematic literature review was conducted on peer-reviewed journal articles published between 2020 and 2025. A title-based query on Lens.org yielded 547 articles, of which 325 met inclusion criteria after two rounds of screening. Thematic analysis was employed to identify five major conceptual clusters. Findings: The review confirms that green certifications (e.g., LEED, BREEAM, WELL) are increasingly occupant-centric but often fall short of delivering consistent environmental outcomes without robust post-occupancy evaluation. Social sustainability literature underscores the role of workspace design, nature integration, and mental health strategies in supporting employee well-being. ESG reporting and governance practices remain fragmented, with limited employee voice, weak accountability mechanisms, and underdeveloped mobility reporting. Smart office studies highlight the convergence of IoT, AI, and human-centered design, while financial analyses reveal positive valuation effects and rental premiums for ESG-certified buildings, particularly in office sectors. However, methodological gaps and uneven adoption persist across contexts and disciplines. Originality/value: This study provides one of the first interdisciplinary syntheses of ESG literature specifically focused on office buildings, combining insights from architecture, real estate, organizational behavior, and digital innovation.

Review
Biology and Life Sciences
Biochemistry and Molecular Biology

Thomas Brüser, Carsten Sanders

Abstract: The twin-arginine translocation (Tat) system is the only general pathway for the transport of folded proteins across energized biological membranes. It is found in the bacterial or archaeal cytoplasmic membrane, the plant thylakoid membrane, or the inner membrane of plant mitochondria. The biological importance of this translocation system can be exemplified by the fact that no bacterial or plant photosynthesis and no photosynthetic oxygen evolution would exist on earth without it. Despite many biochemical and biophysical studies, the Tat mechanism has been puzzling since the system was discovered in the 1990s. Important characteristics of the Tat system could not be explained, and even recent high-resolution structures of the Tat system’s core with bound substrate have not led scientists to a general transport mechanism. In this integrative review, we attempt to answer the key open questions with relevance to the Tat mechanism and thereby develop the first comprehensive explanation of how folded proteins are translocated across membranes by the Tat system.
