Sort by

Article
Computer Science and Mathematics
Mathematics

Zewei Wang

,

Dan Xue

,

Yujia Zhai

,

Cong Li

Abstract: In this paper, we focus on the online stochastic optimization problems in which the random parameters follow time-varying distributions. At each round t, decision is obtained from solving current optimization problem.Then samples are drawn from distributions which are updated after obtaining decision. The objective and constraint are updated in this process, and the updated problem is used to obtain the next decision. For solving the online stochastic optimization problem, we propose a model-based stochastic augmented Lagrangian method, which is referred to as MSALM. At each round, we construct the model functions for the sample objective and constraint functions based on their properties, which reduced the computational complexity. The step size is designed in a dynamic form and decreases as t increases to accelerate convergence. Due to the setting of the online stochastic problem, we use stochastic dynamic regret and constraint violation to measure the performance of our algorithm. Under the assumptions, we prove that our algorithm’s stochastic dynamic regret and constraint violation have a sublinear bound of total number of slots T. We design simulation experiments to verify the efficiency of our online algorithm. Its performance is evaluated on a range of information and system engineering problems, including adaptive filtering, online logistic regression, the time-varying smart grid energy dispatch, the online network resource allocation, and the path planning. In addition, in the context of the path planning problem, we integrate our algorithm with supervised learning to demonstrate its enhanced capabilities. The experimental results validate the performance of our new algorithm in practical applications.

Article
Social Sciences
Library and Information Sciences

Victor Frimpong

Abstract: This paper argues that the institutionalisation of artificial intelligence (AI) disclosure in scientific research has resulted in a type of compliance that emphasises symbolic transparency rather than actual responsibility. Although journals and publishers are progressively mandating that authors disclose their use of AI, these policies remain fragmented, non-standardised, and largely unverifiable. Based on an exploratory review of 80 recent academic articles, the report demonstrates that explicit AI disclosure is limited and, when present, is primarily symbolic or narrative rather than verifiable. This work explains the pattern by introducing the concept of paper compliance and establishing the AI Disclosure Integrity Gap (AIDG), which is defined as the discrepancy between reported AI utilisation and the genuine epistemic impact of AI on research results. The analysis reveals that this disparity is systematically generated by the discordance between transparency-oriented governance frameworks and the iterative, opaque, and irreproducible characteristics of AI-assisted knowledge generation. The study develops testable propositions and introduces the AI Use Traceability Framework (AUTF) as a process-oriented approach for AI governance, emphasising traceability and auditability over transparency. Despite institutional, technical, and incentive-based obstacles that hinder full implementation, traceability provides a means to reduce AIDG and enhance accountability in AI-assisted research. The study advances AI governance and research integrity by treating disclosure as a limited mechanism rather than a complete solution, and by highlighting the risk that current practices create a false sense of transparency.

Review
Medicine and Pharmacology
Pulmonary and Respiratory Medicine

Hussein Mussa Muafa

,

Malika Abdu Balkam

Abstract: Background: Neoadjuvant therapy, particularly chemoimmunotherapy, has transformed the management of locally advanced non-small cell lung cancer (NSCLC). However, treatment-induced hilar fibrosis and tissue adhesions may increase the complexity of subsequent surgical resection, especially for technically demanding procedures such as sleeve lobectomy. The optimal surgical approach—robotic-assisted thoracic surgery (RATS), video-assisted thoracoscopic surgery (VATS), or open thoracotomy—remains uncertain in this setting. Methods: A systematic review and network meta-analysis (NMA) were conducted in accordance with PRISMA and PRISMA-NMA guidelines. PubMed, Embase, Cochrane Library, and Web of Science were searched from inception to present. Studies comparing RATS, VATS, and open sleeve lobectomy in NSCLC patients following neoadjuvant therapy were included, while mixed cohorts were excluded to ensure data homogeneity. Primary outcomes included postoperative complications, 30-day mortality, operative time, and intraoperative blood loss. Secondary outcomes included R0 resection rate, lymph node harvest, conversion rate, bronchial anastomosis time, and length of hospital stay. A frequentist network meta-analysis was performed. Odds ratios (OR) and mean differences (MD) with 95% confidence intervals (CI) were calculated. Heterogeneity was assessed using I² statistics, and treatment ranking was performed using SUCRA. Results: Five retrospective studies comprising 175 patients were included (RATS: 39, VATS: 114, Open: 22). Postoperative complications were comparable across approaches, with no statistically significant differences between RATS and VATS (OR 1.35, 95% CI 0.38–4.7), RATS and Open (OR 1.9, 95% CI 0.25–13.8), or VATS and Open (OR 0.22, 95% CI 0.03–1.6), although a trend favoring VATS was observed. Mortality rates were low and did not significantly differ between groups. 
Minimally invasive approaches (RATS and VATS) were associated with reduced intraoperative blood loss (MD approximately −70 to −100 mL) and shorter hospital stay (reduction of ~1–3 days) compared to open thoracotomy. RATS demonstrated a trend toward higher lymph node harvest (mean difference ~2–3 nodes) and showed a 0% conversion rate, whereas VATS conversion ranged from 4.7% to 30%. SUCRA ranking indicated that RATS had the highest probability of being the optimal approach (0.78), followed by VATS (0.64) and open thoracotomy (0.21). Heterogeneity was low to moderate (I² 0–40%), with no significant inconsistency detected. Conclusions: Minimally invasive sleeve lobectomy, including both RATS and VATS, appears to be safe and feasible for NSCLC patients following neoadjuvant therapy. RATS demonstrated favorable trends in technical outcomes, including lower conversion rates and improved lymph node harvest, and ranked highest in SUCRA analysis. However, given the limited sample size and observational nature of the included studies, these findings should be interpreted with caution. Further large-scale prospective and randomized studies are required to determine the optimal surgical approach in this setting. Systematic Review Registration: PROSPERO CRD420261358976.

Article
Engineering
Architecture, Building and Construction

Michele Versaci

,

Francesco Pittau

,

Iacopo Pizzutilo

,

Gabriele Masera

Abstract: The construction sector plays a central role in global resource depletion and waste generation, with construction and demolition activities accounting for more than one-third of total waste produced in the European Union. Despite growing interest in circular construction, one of the major barriers to large-scale material reuse is the lack of reliable information on the type, quantity, location, and availability of secondary materials in the urban environment. Existing planning tools rarely integrate material stock information into design and policy decision-making processes. Addressing this gap is essential for implementing circular economy strategies and enabling urban mining practices. This study presents the application of a spatially explicit bottom-up Material Stock Analysis (MSA) to quantify and map the embedded materials within an urban district of Milan. The research results in the creation of a secondary material cadaster and the estimation of material stock. The adopted methodology combines municipal GIS datasets, historical cartography, building archetype classification, and literature-derived material intensity coefficients. The final dataset is re-integrated into a geospatial environment to visualize material distributions and generate material-specific spatial analyses and heat maps. The study intends to support architects, urban designers, planners, and policymakers with decision-support information to guide design strategies, demolition planning, and resource governance at the district and metropolitan scales. The outcome aims at bridging architectural design knowledge with urban-scale material information through a replicable GIS-based workflow.

Article
Engineering
Other

Mohammad Sabaeian

,

Alireza Motazedian

,

Mostafa M. Rezaee

,

Fatemeh Sedaghat Jalil-Abadi

,

Mohammad Ghadri

Abstract: A numerical model is presented for heat-coupled continuous-wave second harmonic generation in a double-pass type-II potassium titanyl phosphate (KTP) cavity. The model solves eight coupled partial differential equations governing forward and backward ordinary and extraordinary fundamental fields at 1064 nm, forward and backward second-harmonic fields at 532 nm, three-dimensional transient heat diffusion, and thermally induced phase mismatching (TIPM). Given crystal geometry, beam parameters, pump power, and cooling boundary conditions, the solver produces spatiotemporal temperature distributions, phase-mismatch profiles, and electric-field amplitudes along the propagation axis. The implementation requires less than 8 GB of memory and runs on standard desktop hardware. Comparison with published experimental data yields agreement within 4 % in predicted conversion efficiency. The source code is available under the MIT License (v1.0.2, DOI 10.5281/zenodo.17362470).

Article
Chemistry and Materials Science
Applied Chemistry

Eliakim M. Kambale

,

David S. Rivera Rocabado

,

Yusuke Kanematsu

,

Takayoshi Ishimoto

Abstract: Whether copper fundamentally alters Mo-centered redox thermodynamics or mainly tunes hydrogen adsorption in Ni–Mo electrocatalysts under alkaline hydrogen evolution reaction (HER) conditions remains unresolved. Density functional theory calculations combined with a field-corrected computational hydrogen electrode framework are used to evaluate the thermodynamic stability of H3Mo, H3MoOH, H2Mo(OH)2, and MoO(OH)3 on Cu(111) and Ni(111) and to construct surface Pourbaix diagrams under electrochemical conditions. The results show that substrate identity reorganizes the redox stabilization hierarchy of these Mo intermediates. Across the examined conditions, at least one of H3Mo, H3MoOH, or MoO(OH)3 is thermodynamically favored over H2Mo(OH)2 on both surfaces. However, only Cu(111) exhibits measurable pH-dependent free-energy shifts, reaching 0.28 eV on the reversible hydrogen electrode scale. The magnitude of this electrostatic modulation is comparable to the intrinsic substrate-dependent relative Gibbs free-energy differences, suggesting that Cu reshapes Mo redox thermodynamics rather than merely weakening hydrogen binding strength. Electronic structure and vibrational analyses further show that Cu(111) preferentially weakens Mo–O interactions, whereas Ni(111) more strongly perturbs Mo–H bonding in hydrogen-rich complexes. Overall, these results establish that substrate identity governs the electrostatic modulation of Mo redox thermodynamics under alkaline HER conditions and provide a mechanistic insight into substrate effects relevant to Cu-containing Ni–Mo systems.

Article
Computer Science and Mathematics
Algebra and Number Theory

Frank Vega

Abstract: The Nicolas criterion gives an equivalent formulation of the Riemann Hypothesis as an inequality involving the Euler totient function evaluated at primorial numbers. A natural strategy for establishing this inequality is to prove that a suitable subsequence of the associated ratio sequence is eventually strictly decreasing under the assumption that the Riemann Hypothesis is false. The present work shows that such a subsequence exists. When this monotonicity property is combined with the known limiting behavior of the ratio sequence and the Nicolas equivalence, a contradiction emerges: assuming the Riemann Hypothesis is false forces the subsequence to converge to a limit that is simultaneously equal to $e^{\gamma}$ (by a subsequence argument) and strictly less than $e^{\gamma}$ (by strict monotonicity). The Riemann Hypothesis therefore follows as a direct consequence.

Case Report
Public Health and Healthcare
Public, Environmental and Occupational Health

Gudisa Bereda

Abstract: Organophosphate-induced delayed neuropathy (OPIDN) is a rare, serious neurological consequence of organophosphate poisoning. Unlike acute toxicity, which causes cholinergic crises, OPIDN develops insidiously, often weeks after exposure, leading to progressive sensorimotor deficits. 44-year-old African male pesticide applicator with nine years of organophosphate exposure presented with progressive lower limb weakness, gait disturbances, and paresthesia. The patient exhibited no signs of acute cholinergic symptoms. Neurological examination revealed symmetrical limb weakness, diminished deep tendon reflexes, and distal sensory deficits. Serum cholinesterase levels were decreased. Electrophysiological studies demonstrated axonal degeneration with demyelination, and MRI showed mild spinal cord atrophy. Other causes of neuropathy were excluded. He received supportive care, including physical therapy, pain and spasticity management, antioxidants, vitamins, and off-label intravenous methylprednisolone. Over four months, he regained partial functional improvement, with residual weakness and mild gait disturbance. Chronic low-level organophosphate exposure can cause OPIDN even without acute poisoning. Diagnosis relies on occupational history, neurological examination, and electrophysiological findings. Management is primarily supportive; off-label therapies such as methylprednisolone may reduce neuroinflammation and oxidative stress but are not part of standard care. Early recognition, timely preventive measures, and long-term rehabilitation are essential to improve functional outcomes and quality of life.

Essay
Arts and Humanities
Philosophy

Álvaro Acevedo

Abstract: This article critically examines the conceptual, historical, and epistemological foundations of bioethics as a transdisciplinary field that emerges in response to the ethical tensions produced by technoscientific development. Through an analytical and interpretative approach, the paper revisits the historical events that shaped modern bioethics, and the contemporary challenges that arise from the expansion of biomedical and technological interventions. The analysis highlights the persistent dilemmas involving autonomy, paternalism, vulnerability, and intercultural asymmetries. It also addresses the ethical impact of technoscience on the reconfiguration of life, death, and human nature. The article argues pluralistic and adaptive bioethics capable of sustaining epistemic vigilance and guiding decision-making processes in diverse and complex sociocultural contexts.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Sameer Kumar Singh

Abstract: We present MINT (Multilingual Indic Neural Transformer), a compact 14.7M parameter encoder-decoder architecture for abstractive summarization across seven Indic languages. MINT is designed specifically to operate within the memory envelope of a single commodity NVIDIA T4 GPU (15 GB VRAM), addressing the paradox in which models serving the most resource-constrained communities are themselves the most resource-intensive to deploy. The architecture incorporates Rotary Position Embeddings (RoPE), SiLU feed-forward activations, DropPath regularization, weight tying, and a custom 32,000-token SentencePiece Unigram tokenizer trained over balanced Indic corpora. Training proceeds in two phases on the XL-Sum BBC dataset across Hindi, Bengali, Marathi, Tamil, Telugu, Punjabi, and Urdu: a fluency phase (epochs 1-15) using linear warmup with cosine decay, followed by a refinement phase (epochs 16-25) with a flat low learning rate and a combined coverage-attention entropy loss that jointly penalizes repetition and hallucination. We conduct the first identical-regime comparison in Indic summarization, fine-tuning both IndicBART (440M parameters) and mT5-small (556M parameters) under the same loss function, optimizer, decoding strategy, and data pipeline as MINT’s refinement phase. On the XL-Sum test set, MINT achieves an average ROUGE-1 of 0.1187 at epoch 15, rising to 0.1302 on validation after full refinement, reaching approximately 84.8% of IndicBART’s ROUGE-1 (0.1409) on the six overlapping languages while using only 3.3% of its parameters. A critical methodological contribution of this work is the demonstration that the standard Google rouge_score library returns zero for all Indic scripts due to English centric tokenization; we implement and advocate for whitespace-based ROUGE evaluation as the correct approach. 
MINT additionally benefits from BERTScore-F1 of 0.8497 (via XLM-RoBERTa-Large) and LaBSE embedding cosine similarity of 0.4306, confirming that generated summaries carry semantic meaning even when surface overlap metrics are modest. All code and checkpoints are publicly released.

Article
Public Health and Healthcare
Public Health and Health Services

Huy Le Ngoc

,

Hoa Nguyen Binh

,

Giang Le Minh

,

Luong Dinh Van

Abstract: BackgroundMobile health (mHealth) interventions have shown promise in supporting tuberculosis care, but their association with patient knowledge, attitudes, and practices (KAP) among people with multidrug-resistant tuberculosis (MDR-TB) remains poorly evaluated in high-burden, programmatic settings. We assessed the association between a smartphone-based mHealth application and KAP regarding treatment adherence and adverse events within the V-SMART randomised controlled trial in Vietnam.MethodsThis nested cross-sectional study included 528 patients with MDR-TB (278 intervention, 250 control) enrolled across seven provinces in Vietnam between 2023 and 2025. KAP was measured using a validated questionnaire (Knowledge 0–17, Attitude 0–44, Practice 0–19, total 0–80). Multivariable linear regression adjusted for age, sex, province, education, time on treatment, PHQ-9, stigma, and social support. Dose-response relationships with self-reported app usage were examined in the intervention arm.ResultsThe mHealth intervention was associated with higher total KAP scores (adjusted mean difference 5.0 points, 95% CI 3.3–6.7, p<0.001), with largest gains in practice (+2.2 points) and knowledge (+1.1 points). Clear dose-response relationships were observed: each additional month of app use was associated with a 0.81-point increase in total KAP score (p<0.001).ConclusionThe smartphone-based mHealth intervention was associated with meaningfully higher KAP scores among MDR-TB patients in Vietnam. These findings support integration of mHealth tools into routine programmatic management of drug-resistant tuberculosis in high-burden settings.

Review
Computer Science and Mathematics
Robotics

William Lawless

Abstract: In this review article, we introduce the problem of team interaction, cover the mathematics, results, discussion, conclusions, and a path forward. To begin, cognitive science assumes a 1:1 relationship between beliefs and actions, whether with games, concepts, preferences, rational choices, eyewitness accounts, or self-reported pain. Unfortunately, it generalizes to reinforcement for generative-AI (gen-AI), a lower form of learning which can not account for higher-level cognition, the resistance of biases to be rectified, the inability to predict successfully without updated information, and supports Planck’s lament that physics evolves one funeral at a time. The problem with 1:1 beliefs-to-reality is that observations of social interaction only produce separable independent and identically distributed (i.i.d.) data, which, by definition, cannot reconstruct the interactions observed. Presently, Gen-AI uses separable i.i.d. data, preventing Gen-AI models from replicating interdependent human systems. Failing to account for interdependence, classical models of teams do not generalize, nor do their models predict advantages. Solving this problem is critical to advancing the science of teams arbitrarily composed of human-AI-machine-robot members. In contrast, based on interdependence, choosing how to “squeeze" uncertainty in our quantum-like (Q-L) model of teams, generalizes (e.g., vulnerability, espionage, time-energy tradeoffs), models self-organization’s ability to provide advantages (e.g., innovation) not possible under command decision-making (viz., authoritarianism), and may solve the hard-to-find connection between mind and reality. Our results suggest that humans have dual cognitive systems, one being cognition and the other embodied, but hidden, interdependence, which Simon was unable to capture and Kahneman had begun to address, our exemplar being Einstein’s decade-long struggle to construct his concept of general relativity. 
In the future, we propose that coupled tuning “squeezes" interdependent information to produce the advantages we have found over CDM and current AI risks.

Article
Computer Science and Mathematics
Computer Vision and Graphics

Yu Shang

,

Yinzhou Tang

,

Xin Zhang

,

Shengyuan Wang

,

Yuwei Yan

,

Honglin Zhang

,

Zhiheng Zheng

,

Jie Zhao

,

Jie Feng

,

Chen Gao

+2 authors

Abstract: World models have emerged as a pivotal research direction, with recent breakthroughs in generative AI underscoring their potential for advancing artificial general intelligence. For embodied AI, world models are critical for enabling robots to effectively understand, interact with, and make informed decisions in real-world physical environments. This survey systematically reviews recent progress in embodied world models, under a novel technical taxonomy. We hierarchically organize the field by model architectures, training methodologies, application scenarios, and evaluation approaches, thus offering researchers a clear technical roadmap. We first thoroughly discuss vision-based generative world models and latent space world models, along with their corresponding training paradigms. We then explore the multifaceted roles of embodied world models in robotic applications, from functioning as cloud-based simulation environments to on-device agent brains. Additionally, we summarize important evaluation dimensions for benchmarking embodied world models. Finally, we outline key challenges and provide insights into promising future research directions within this crucial domain. We summarize the representative works discussed in this survey at https://github. com/tsinghua-fib-lab/Awesome-Embodied-World-Model.

Article
Engineering
Other

Nicola Abeni

,

Riccardo Costa

,

Emilia Scalona

,

Diego Torricelli

,

Matteo Lancini

Abstract: Robotic assistive devices, such as exoskeletons, are increasingly employed in walking rehabilitation. Therefore, the measurement of both movement kinematics and cognitive workload is important to understand this human-robot interaction in real-world contexts. To address this need this study presents the validation of a framework integrating inertial motion capture (Xsens) and eye-tracking sensor (Pupil Neon) within a Mixed Reality (Meta Quest 3) architecture. We developed an overground dual-task paradigm in which holographic numbers appear in the user’s peripheral vision. This setup actively stimulates visuospatial attention while quantifying kinematic and cognitive output. To validate the framework, the protocol has been tested on 30 healthy subjects across repeated exoskeleton training sessions. Statistical analyses revealed that the Multiple Correlation Coefficient (CMC) and Spectral Arc Length (SPARC), calculated on the shank angular velocity, together with the Step Length Variability exhibited significant time effects (p < 0.01), mapping the transition toward automated gait. Concurrently, pupillometric data demonstrated a measurable reduction in neurocognitive demand; specifically, the Task-Evoked Pupillary Response (TEPR) decreased significantly across progressive training sessions (p < 0.05). With this work, we validated a measurements protocol that aims to provide a novel methodology for objectively evaluating motor and cognitive adaptation in wearable assistive devices.

Article
Computer Science and Mathematics
Robotics

Ashwin Kumar

,

P. Bavithra Matharasi

Abstract: Nano-UAVs weighing under 50g have become useful IoT platforms for GPS-denied navigation, but fitting a neural network into their sub-512kB memory and sub-100mW power budget remains an open engineering problem. PULP-Dronet v3 tackles this with depthwise separable (D+P) blocks and a channel-reduction factor γ. Even so, its most compressed variant (γ = /8, 1.1M MACs) loses 6 percentage points of collision accuracy versus the full model. Methods: We swap the 5×5 first convolution for a 3×3 depthwise + 1×1 pointwise pair, and retrain with cosine-annealing scheduling and per-epoch color-jitter augmentation. Results: At γ = /4 the model has 6409 parameters, needs only 540K MACs, and scores 83.97% collision accuracy with 0.372 steering RMSE on the official benchmark—+2.97pp over the same-γ baseline at 4.4× less compute. The full γ = /1 model (12M MACs) reaches 84%; our model nearly matches it with 22× fewer operations. Conclusions: Factorizing the stem and adjusting the training recipe recovers most of the accuracy lost to aggressive channel reduction, without adding inference cost.

Article
Public Health and Healthcare
Public Health and Health Services

Melvin Omone Ogbolu

,

Olanrewaju D. Eniade

,

Alex Ugochukwu Gbenimachor

,

Miklós Kozlovszky

Abstract: Background: Several research has revealed that dehydration remains a major cause of preventable illnesses, particularly among children and older adults. Existing tools such as the WHO IMCI, Gorelick, and Clinical Dehydration Scale (CDS) are limited by population focus and absence of quantitative weighting or digital integration. This study developed and prototyped an evidence-based dehydration-risk prediction model derived from meta-analytic data to enable more objective and universal risk estimation. Methods: Building on our recent systematic review and meta-analysis (Ogbolu et al., 2025), sixteen (16) clinical and demographic predictors were extracted from validated dehydration scales and pooled diagnostic evidence. Heuristic weights (1–4 points) were assigned according to pooled sensitivity and specificity, yielding a total score of 0–42. The total score was transformed to generate continuous probability estimates using logistic regression. The scoring algorithm was embedded within an interactive R Shiny software prototype that supports real-time computation and visualization. Prototype evaluation involved functional verification and usability testing using simulated patient profiles. Results: High-weight predictors, thirst, inability to drink, and lethargy showed the strongest diagnostic value, while modifiers such as age (≥ 65 years) and comorbidity carried lower weights. The cumulative score was transformed into a continuous dehydration-risk probability using a logistic function, reflecting the nonlinear increase in risk with symptom burden. Prototype evaluation of the MetaDehydrate application using simulated profiles demonstrated accurate score computation, consistent probability outputs, sub-second computation latency (< 0.2 s per calculation), and favorable usability feedback. 
Conclusion: This study presents the design and technical feasibility evaluation of an evidence-informed dehydration risk–scoring algorithm and its implementation as a prototype digital decision-support tool. While no clinical effectiveness was assessed, the findings demonstrate the feasibility of translating pooled diagnostic evidence into a functional, user-interactive application. The tool’s simplicity, limited input requirements, and rapid computation suggest potential utility for future evaluation in community and resource-constrained healthcare settings. Further prospective studies are required to assess effectiveness in real-world and low-resource healthcare settings.

Article
Physical Sciences
Theoretical Physics

Mário Sérgio Guilherme Junior

Abstract: We investigate the semiclassical structure of spin-foam transition amplitudes for boundary data that do not admit a real Lorentzian Regge geometry. Considering a fixed triangulation with a single dominant vertex, we demonstrate that when boundary tetrahedra carry mutually incompatible causal orientations, the closure equations have no real solution and the path integral is dominated by a complex Euclidean saddle of the Regge action. In this regime the vertex amplitude acquires a non-oscillatory factor of the form exp(−SE/ℏ), where SE is the Euclidean action evaluated at the complex saddle. We introduce a causal-obstruction criterion based on a convexity argument for the future timelike cone in R 3,1 , and establish a formal classification of boundary data into three types according to the existence and nature of the saddle-point solutions. We show that SE scales linearly with the spin parameter j in the semiclassical limit, SE = ℏ j C(α)/(8πG), where C(α) is a finite dimensionless geometric constant, providing explicit control over the suppression. Non-degeneracy of the Hessian at the complex saddle is verified after gauge fixing, confirming the validity of the saddle-point approximation. The results constitute a proof-of-concept demonstration that exponentially suppressed, causally confined quantum-geometric transitions emerge as a structural feature of the covariant formulation of loop quantum gravity, without additional postulates.

Communication
Biology and Life Sciences
Food Science and Technology

Ana Camile Assis de Jesus

,

Jaqueline dos Santos de Jesus

,

Beatriz Fernandes Vieira

,

Letícia de Jesus Tedgue

,

Ludimilla Adorno Vasconcelos

,

Radharani de Melo Serafim Ferreira

,

Pâmela Cristine Barroso de Almeida

,

Maria Eugênia de Oliveira Mamede

Abstract: This study aimed to evaluate how specific quantities of green tea and sugar, as well as fermentation temperature, impact variations in kombucha quality parameters. The study used tea quantities ranging from 0.5 to 3.0% w/v, sugar from 3.0 to 6.0% w/v, and fermentation temperatures of 20°C and 26°C. Beverage quality parameters such as pH, volatile acidity, alcohol content, and SCOBY growth were evaluated. At a temperature of 20 ± 2°C, formulations with 6.0% w/v sugar showed no SCOBY growth, and two for-mulations showed volatile acidity above the established maximum limit of 130 mEq L⁻¹. Most formulations had an alcohol content below 0.5% v/v and were classified as non-alcoholic. At 26 ± 2°C, the greatest SCOBY growth occurred, with the highest rec-orded value of 203%. Only two formulations showed an alcohol content above 0.5% v/v, but with values close to the limit. High amounts of sugar do not favor SCOBY growth at either mild or higher temperatures (26°C). Variations in temperature and ingredient quantities influence the production of green tea kombucha that meets safety and quality requirements. These data show the variations in ingredients and temperatures that favor kombucha production, considering its quality and classification.

Article
Biology and Life Sciences
Aquatic Science

Orkide Minareci

,

Ersin Minareci

,

Furkan Bilgic

,

Ergun Taskin

Abstract: The aim of the study is to determine physico-chemical parameters and eutrophication criteria in the Aegean Sea. The pH, temperature, salinity, dissolved oxygen, turbidity, conductivity, phosphate, ammonium nitrogen, nitrite nitrogen, nitrate nitrogen and chlorophyll-a parameters were determined. The sampling was conducted at 25 stations (Enez, Saros Bay, Gökçeada, Yeniköy, Bozcaada, Babakale, Altınoluk, Ayvalık, Dikili, Çandarlı, Foça, Bostanlı, Urla, Ildır, Çeşme, Sığacık, Kuşadası, Didim, Güllük, Bodrum, Akyaka, Gökova, Datça, Bozburun 1, Bozburun 2) during the spring-summer-autumn seasons of 2022 and 2023. In the Aegean Sea, the mean values were determined as follows: pH 8.05, temperature 21.80 °C, dissolved oxygen 7.86 mg/L, salinity 34.13‰, turbidity 25.15 mg/L, electrical conductivity 50.27 µS/cm, phosphate 9.39 µg/L, ammonium nitrogen 29.59 µg/L, nitrite nitrogen 0.5 µg/L, nitrate nitrogen 1.9 µg/L, and chlorophyll-a 1.76 µg/L.

Communication
Social Sciences
Decision Sciences

Rafael Garcia-Sandoval

Abstract: AI cannot be the property of just five or seven companies in the world, because its development and evolution have been the work of hundreds of thousands of researchers and scientists over more than two centuries, and in some cases over two millennia, beginning with Pythagoras, Euclid, Aristotle, Al-Khwarizmi and many others. They have left us their legacy in the form of the foundations on which AI stands today. AI is not solely the work of technology company CEOs, as it has been demonstrated that they have drawn on the intelligence, skills, knowledge and innovations of thousands of anonymous programmers and engineers. It is even less likely to belong to a government that cares only about its own security and the financial and psychological control of its society. Artificial intelligence is a precious legacy built on foundations of the greatest importance and value, beginning with the binary system, originally known as Boolean logic, first described in The Mathematical Analysis of Logic, published in 1847 by George Boole (1815–1864), and in Formal Logic by Augustus De Morgan (1806–1871). These strands came together as a tool of incalculable mathematical value in the 1894 work of John Venn (1834–1923), Symbolic Logic, which consolidated the concepts for the mathematical treatment of sets and the practical application of the Boolean system. Another valuable contribution is the research of the Spanish histologist Santiago F. Ramón y Cajal (1852–1934), whose Texture of the Nervous System of Man and Vertebrates (1904) produced results that were key to the application of artificial intelligence in neural networks. John Bardeen (1908–1991) and Walter Brattain (1902–1987) invented the transistor at Bell Laboratories in 1947, building on the theoretical work of Carl Ferdinand Braun (1850–1918); the name transistor was coined by John R. Pierce (1910–2002).
Other significant precursors include Gottfried Leibniz, Gottlob Frege, Bertrand Russell and Alfred North Whitehead, David Hilbert, Charles Babbage, John von Neumann, Claude Shannon, Alan Turing, John McCarthy, Edward Feigenbaum, Douglas Lenat, Judea Pearl, Lotfi Zadeh, John Hopfield and Geoffrey Hinton, as well as hundreds of thousands of unknown engineers. Significant contributions have also been made by research laboratories such as Bell Labs and CERN, and by thousands of research universities around the world. The future of second-generation AI will be supported by the work of Thomas Fowler, Jan Lukasiewicz [1], [2], Alfred Tarski, Stephen Cole Kleene, the Setun project, and the scientists, universities and laboratories around the world carrying out research on balanced ternary and fuzzy logic. For all the above reasons and more, AI must be declared part of the Cultural and Technological Heritage of Humanity.


Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.

© 2026 MDPI (Basel, Switzerland) unless otherwise stated