
Review
Engineering
Energy and Fuel Technology

M. Amir Siddiq, Salaheddin Rahimi, Jianglin Huang, Giribaskar Sivaswamy

Abstract: Marine renewable energy systems, including offshore wind, tidal, and wave technologies, are central to global net-zero strategies but remain constrained by reliability-driven costs and uncertainty in structural performance. In harsh offshore environments, interacting degradation mechanisms (such as corrosion–fatigue, hydrogen embrittlement, variable-amplitude loading, wear, and manufacturing-induced variability) govern failure, yet are not adequately captured by existing empirical design frameworks. This review presents a comprehensive, mechanism-based perspective on structural integrity in marine renewable energy systems, explicitly linking microstructure-sensitive deformation and damage processes to engineering-scale performance and reliability. The materials landscape, including structural steels, titanium alloys, fibre-reinforced composites, and additively manufactured materials, is critically examined with emphasis on process–structure–property–performance relationships. Multiscale modelling approaches are synthesised, spanning crystal plasticity finite element modelling, mesoscale damage formulations, fracture mechanics, structural reliability methods, and emerging digital twin and data-driven frameworks. A key contribution of this work is the integration of microstructure-resolved modelling with system-level reliability and qualification, addressing a critical gap between materials physics and engineering design standards. The review identifies critical limitations in current practices, including the lack of explicit treatment of coupled degradation mechanisms, insufficient representation of manufacturing variability, and the absence of consistent uncertainty propagation across scales. Building on these insights, an integrated, mechanism-resolved framework is proposed that combines multiscale modelling, manufacturing-aware qualification, inspection-informed updating, and hybrid physics–data approaches. 
This framework supports a transition from static, empirical design towards predictive, lifecycle-based structural integrity assessment, enabling improved reliability, reduced uncertainty, and more cost-effective deployment of next-generation marine renewable energy systems.

Review
Medicine and Pharmacology
Pharmacology and Toxicology

Marta Jóźwiak-Bębenista, Anna Stasiak, Monika Sienkiewicz, Paweł Kwiatkowski, Edward Kowalczyk

Abstract: Aging is associated with chronic, low-grade inflammation (“inflammaging”), which contributes to neuropsychiatric and neurodegenerative disorders such as depression, Alzheimer’s disease, and Parkinson’s disease. Conventional pharmacotherapies often provide limited benefit in older adults and are further complicated by polypharmacy and drug-drug interactions. Psilocybin, a serotonergic psychedelic acting primarily as a 5-HT2A receptor agonist and currently undergoing accelerated clinical development, has emerged as a potential multimodal therapeutic agent addressing these challenges. Acting via its active metabolite psilocin, 5-HT2A-mediated signaling biases cortical glutamatergic transmission, enhances TrkB/BDNF pathways, and modulates neuro-immune cascades (including NF-κB), with convergent systems-level effects such as re-organization of the default mode network. Human studies report acute reductions in TNF-α with variable effects on IL-6 and CRP, consistent with an immunomodulatory profile. Pharmacokinetically, psilocybin shows properties advantageous in geriatric care: rapid onset, short half-life, and predominant phase-II glucuronidation, reducing interaction risk. Controlled studies demonstrate rapid antidepressant and anxiolytic effects in major depressive disorder, treatment-resistant depression, and existential distress, with emerging feasibility signals in neurodegeneration. Together, these findings support the hypothesis that a time-limited, mechanism-based intervention may improve mood and cognition while attenuating inflammation. This review integrates current evidence on psilocybin’s neuroimmune and pharmacokinetic mechanisms relevant to aging, outlining its potential role in inflammation-related disorders and highlighting the need for targeted studies in older adults, who remain underrepresented in psychedelic research.

Article
Biology and Life Sciences
Food Science and Technology

Bahram Faraji, James Wachira, Roshan Paudel, Akriti Dhakal

Abstract: Food fermentation is a widely used processing technique that enhances sensory properties, shelf life, and nutritional value, partly through the activity of beneficial microorganisms. This study investigated the microbial communities associated with traditional pearl millet fermentation and their potential nutritional contributions. Pearl millet (TiftLHB open-pollinated variety) was obtained from USDA-ARS and subjected to spontaneous fermentation in sterilized water at 28 ± 2 °C for 72 hours, followed by wet milling and an additional 72-hour fermentation. Microbial DNA was extracted, and 16S rRNA amplicon sequencing was performed after PCR amplification and quality control. Sequence data were analyzed using DADA2 and PICRUSt2 pipelines for taxonomic and functional prediction. The dominant bacterial genera identified were Weissella and Lactobacillus, both commonly associated with cereal fermentations. Weissella is known for reducing antinutrients and contributing to folate production, while the overall microbial profile was consistent with reports from other regions, including the presence of lactic acid bacteria such as Leuconostoc. These findings suggest that spontaneous fermentation of pearl millet supports microbial communities with potential nutritional and functional benefits. Metagenomic approaches may provide an effective strategy for identifying and optimizing beneficial microorganisms to enhance the nutritional quality and health-promoting properties of fermented cereal-based foods.

Article
Computer Science and Mathematics
Computer Science

Yuxia Qian, Yiwen Liang, Lei Shang, Xinqi Dong, Yincheng Liang

Abstract: Network access control and identity verification establish a secure foundation for trusted communication between entities. However, successful identity authentication alone does not guarantee secure communication. In open-network environments, it remains essential to establish a secure session key via a robust key agreement mechanism—one that prevents explicit disclosure of identity information while ensuring post-quantum security. To address these requirements, we propose a lattice-based key agreement protocol. The protocol integrates identity binding, implicit authentication, and session key establishment into a single ciphertext exchange. Furthermore, it supports secure key evolution and revocation verification through a version-control mechanism and a blockchain-maintained revocation list—thus realizing a comprehensive, post-quantum-secure key agreement scheme under reasonable computational and communication overhead.

Article
Environmental and Earth Sciences
Sustainable Science and Technology

Harsh Deep Singh Narula

Abstract: Artificial intelligence offers tremendous potential for landscape-scale biodiversity conservation, yet the significant energy consumption of large-scale AI models creates a fundamental paradox: the computing resources required to train and deploy these systems add to the very environmental degradation they seek to prevent. This paper proposes a multi-level, energy-aware AI architecture for constructing ecosystem digital twins that enables prescriptive, rather than merely descriptive or predictive, conservation management. The proposed framework classifies conservation tasks across three levels: classic machine learning for continuous environmental monitoring and species distribution prediction; deep learning for perception-oriented tasks such as computer vision and bioacoustics analysis; and foundation models for cross-domain synthesis and stakeholder interaction, where their capabilities are irreplaceable. We apply this architecture to a conceptual digital twin of the Greater Yellowstone Ecosystem, demonstrating how multi-tiered AI integration can model ecological systems spanning wolves, elk, vegetation, beavers, and hydrology to generate actionable, prescriptive insights concerning conservation. A comparative energy footprint analysis estimates that the tiered approach decreases computational energy consumption by approximately 62–74% relative to a foundation-model-centric baseline, while sustaining or improving conservation decision quality. This work addresses a key gap in the literature by providing the first integrated architectural framework that explicitly optimizes the trade-off between AI capability and environmental cost for landscape-scale conservation applications, supplying a replicable blueprint for resource-constrained conservation organizations worldwide.

Technical Note
Computer Science and Mathematics
Data Structures, Algorithms and Complexity

Xiang Meng

Abstract: The classical binary heap sink operation based on swap has a significant write overhead. We examine two intuitive improvements: swapping siblings (verified via bounded SMT search) and adding a local hint called pref (the hint-assisted variant). In our bounded SMT checks and implementation comparisons, we did not find evidence that these variants provide consistent benefits; PerfView measurements show the hint-assisted variant was slower in most configurations. Our results suggest that reverting to the straightforward hole-based sink is the practical choice for write-efficient implementations.
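For readers unfamiliar with the distinction the abstract draws, the following is a minimal sketch of a hole-based sink in Python (names are ours, not the paper's): a swap-based sink writes two elements at every level, while the hole-based form moves only the winning child up into the "hole" at each level and writes the sinking element once at the end.

```python
def sift_down(heap, i, n):
    """Hole-based sink for a max-heap stored in heap[0:n].

    Instead of swapping heap[i] with a child at every level (two writes
    per level), keep heap[i]'s slot as a "hole", shift children up into
    it (one write per level), and write the sunk value once at the end.
    """
    val = heap[i]              # element being sunk; its slot is the hole
    while True:
        child = 2 * i + 1      # left child of the hole
        if child >= n:
            break
        # pick the larger of the two children
        if child + 1 < n and heap[child + 1] > heap[child]:
            child += 1
        if heap[child] <= val:
            break              # hole has found its resting place
        heap[i] = heap[child]  # single write: child moves up into the hole
        i = child
    heap[i] = val              # single final write for the sunk element
```

The behavior is identical to the textbook swap-based sink; only the write count per level differs.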

Article
Biology and Life Sciences
Plant Sciences

Li Zhang, Tie Zhou, Yuxia Zhou, Yingshu Peng, Guolin Huang, Guimei Tang, Yang Liu, Yuanzhi Xiao, Fan Zhao, Weidong Li, +2 authors

Abstract: Wild orchid populations are declining with intensified habitat fragmentation, posing severe challenges to germplasm conservation. As an important ornamental Orchidaceae species, Cymbidium ensifolium has abundant germplasm resources and frequent natural and artificial hybridization. Long-term natural evolution and anthropogenic disturbance have led to complex genetic backgrounds and ambiguous phylogenetic relationships, hindering accurate germplasm identification, elite resource excavation, and selective breeding. As a distinctive variety, Cymbidium ensifolium var. susin has great breeding potential. Clarifying its phenotypic and genetic characteristics is crucial for accelerating breeding progress. In this study, phenotypic determination, Hyper-seq reduced-representation genome sequencing, SNP/InDel genotyping, genetic diversity analysis, and core collection construction were used to evaluate the genetic diversity, population differentiation, and core germplasm screening of 13 Cymbidium ensifolium var. susin accessions. The results showed significant phenotypic differences and rich genetic variation among the tested materials. Based on highly weighted floral traits, the accessions were divided into three major phenotypic groups. At the molecular level, 963,239 SNP and 182,399 InDel loci were identified, mainly distributed in intergenic regions, followed by introns and exons. A phylogenetic tree constructed from SNP loci, combined with principal component and phenotypic clustering analyses, clarified the genetic structure of pure-heart Cymbidium ensifolium var. susin, showing a distinct geographical pattern: "high consistency in Fujian and Guangdong, strong differentiation in Southwest China, and a transitional gradient in central China". Meanwhile, six core germplasm accessions were screened in this study, providing a solid theoretical basis and material support for the conservation of pure-heart Cymbidium ensifolium var. susin accessions, variety improvement, hybrid parent selection, and molecular marker-assisted breeding. This is of great significance for promoting the innovation of Chinese orchid germplasm resources and the high-quality development of the industry.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Chang Chia-Wei

Abstract: This study addresses the problem of zero-shot generalization (ZSG) in deep reinforcement learning by proposing an MNK game strategy learning method based on a Fully Convolutional Deep Q-Network (FCN-DQN). Research in deep reinforcement learning aims to develop algorithms that can generalize well to unseen environments at deployment time, thereby avoiding overfitting to the training environment. Solving this problem is crucial for real-world applications, where environments are diverse, dynamic, and inherently unpredictable. By constructing a fully convolutional reinforcement learning policy network composed entirely of convolutional layers with padding to preserve feature map dimensions, the proposed model is able to handle input boards of varying spatial sizes. The model effectively learns local pattern-based strategies and approximations of the k-in-a-row evaluation function rather than performing global search. Furthermore, due to parameter sharing, the network has a relatively small number of parameters and is able to share policy representations across different board scales, thereby improving both sample efficiency and inference efficiency. Experimental results demonstrate that, after being trained on a 3×3 board, the proposed model is able to achieve a certain degree of zero-shot generalization performance in larger, unseen board environments.
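The size-agnostic property the abstract relies on can be illustrated with a minimal NumPy sketch (the kernel weights and function name here are our own illustration, not the paper's trained network): a convolution with "same" zero padding produces an output map of the same shape as the input board, so a single set of shared weights yields a per-cell value map on a 3×3 board or a 5×5 board alike.

```python
import numpy as np

def conv2d_same(board, kernel):
    """3x3 convolution with zero padding so output shape == input shape.

    Because the same small kernel slides over any board, one set of
    weights produces a per-cell output map for boards of any size --
    the property a fully convolutional policy exploits for zero-shot
    transfer across board scales.
    """
    h, w = board.shape
    padded = np.pad(board, 1)              # one ring of zeros on each side
    out = np.empty((h, w), dtype=float)
    for r in range(h):
        for c in range(w):
            out[r, c] = np.sum(padded[r:r + 3, c:c + 3] * kernel)
    return out

# A toy horizontal "line detector" kernel (hypothetical weights).
kernel = np.array([[0., 0., 0.],
                   [1., 1., 1.],
                   [0., 0., 0.]])

q_small = conv2d_same(np.eye(3), kernel)   # 3x3 board -> 3x3 value map
q_large = conv2d_same(np.eye(5), kernel)   # 5x5 board -> 5x5 map, same weights
```

A real FCN-DQN stacks several such padded convolutional layers, but the shape-preservation argument is the same at every layer.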

Article
Chemistry and Materials Science
Biomaterials

Danilo Escobar-Avello, Tomás Oñate Valdés, Victor Ferrer, Cecilia Fuentealba, Sergio Benavides-Valenzuela, Gustavo Cabrera-Barjas, Gastón Bravo-Arrepol, Ady Giordano, Beatriz Gullón, Jorge Santos

Abstract: Conventional and emerging extraction methods for recovering phenolic compounds (PCs) from Pinus radiata bark were investigated for their potential use in bio-composites and bio-based biomaterial applications. To optimize the recovery process, a Response Surface Methodology (RSM) based on a Box-Behnken design was used to evaluate the effects of extraction time (20–100 min), temperature (20–80 °C), and water or ethanol-water solvent concentrations with β-cyclodextrin (βCD) or NaOH (0.5–1.5% w/v CD/db). Polyphenolic profiles of the extracts were characterized using Fourier transform infrared spectroscopy (FTIR), LC-LTQ-Orbitrap-MS, and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS). Thermogravimetric analysis (TGA) and differential scanning calorimetry (DSC) were used to evaluate the thermal stability and degradation behavior of the powdered extracts. Antioxidant capacity (DPPH, FRAP, ABTS) and antibacterial activity against Escherichia coli and Staphylococcus aureus were assessed by spectrophotometric assays and the agar diffusion method, respectively. The highest extraction yields were obtained using alkaline extraction (14.32%) and ultrasound-assisted extraction (UAE) (13.86%), followed by ethanol extraction (12.74%). The minimum inhibitory concentration (MIC) of P-βCD was 0.04 mg/mL and the minimum bactericidal concentration (MBC) was 0.32 mg/mL against S. aureus. These results suggest a strong inhibitory capacity at low concentrations and the potential incorporation of these extracts into bio-based antimicrobial biomaterials.

Article
Biology and Life Sciences
Food Science and Technology

Kushnerenko V. G., Fedorchuk O. M., Riapolova I. O., Avercheva N. O., Andreichenko A. O.

Abstract: This study evaluates animal welfare as a factor influencing economic losses and resource use efficiency in the global livestock sector for 2025–2026. Using analytical assessment and conceptual modeling (TEL, WFW, and LDI models) based on FAO and USDA data, the study estimates the economic and environmental implications of pre-slaughter stress. The results suggest that global meat losses may reach approximately 19.4 million tons annually, with associated economic losses potentially exceeding USD 90 billion. These losses correspond to an estimated 184.2 km³ of freshwater and approximately 138.5 million tons of feed resources, indicating reduced efficiency in resource utilization. The findings highlight that improving animal welfare may represent not only an ethical consideration but also a potential approach to enhancing resource efficiency and sustainability in livestock production systems. The study supports the integration of welfare-related parameters into agricultural and food system policies, particularly in regions affected by logistical disruptions.

Article
Physical Sciences
Theoretical Physics

Georgios I. Alamanos

Abstract: Understanding whether the mathematical structure of quantum mechanics is fundamental or emergent remains a central question in the foundations of physics. In particular, the special role played by time in quantum theory, appearing as an external evolution parameter rather than a dynamical observable, suggests that the formalism itself may arise from deeper structural considerations. In this work, we investigate the emergence of quantum mechanical formalism from classical wave dynamics by adopting a dimensional framework in which time is treated as a +1 evolution parameter relative to the dimensions through which physical phenomena (fields or disturbances of a field) propagate and interact. Within this perspective, different fields may evolve with respect to different effective dimensions, while remaining embedded in a common higher-dimensional space, allowing time to acquire a relational and context-dependent role. This means that in our proposed model, time is not a fixed dimension which is experienced the same way for every field or field interaction of any dimensionality. In that sense, time for one physical phenomenon can behave as space for a higher dimensional physical phenomenon, whose time is a different +1 dimension. The central objective of this paper is to determine how a higher-dimensional deterministic field can be consistently represented by a lower-dimensional description that lacks direct access to its full set of evolution parameters and evolves through a spatial (for the higher-dimensional field) dimension. To this end, we introduce a general projection framework in which a higher-dimensional field is mapped to a reduced field through an interaction-based recording process. Crucially, we do not assume the form of this mapping a priori. Instead, we impose the requirement that it preserve the maximum amount of physically accessible information. 
In particular, we demand the faithful encoding of phase relations, interference structure, and spectral composition, including the relative contributions of different Fourier modes and their superposition. We first demonstrate, within a purely classical 3+1-dimensional wave framework, that these constraints severely restrict the admissible form of the reduced description and naturally lead to complex amplitudes, linear superposition, Hilbert space structure, and canonical operator relations. This analysis provides an intuitive and mathematically explicit route to quantum-like descriptions without assuming quantum postulates. We then generalize the construction to a 4+1-dimensional framework, introducing an additional evolution parameter and showing that under the same information-preserving constraints, the Schrödinger equation appears as an effective low-energy description of the reduced dynamics, while a relativistic dispersion relation emerges simultaneously through the encoding of the hidden evolution parameter as an invariant frequency scale. In this way, both quantum mechanical and relativistic structures arise from the same underlying requirement: the consistent and information-preserving representation of higher-dimensional wave propagation in a lower-dimensional observational framework. The results suggest that the formal structure of quantum mechanics need not be postulated a priori, but may instead be understood as the unique mathematical language required to encode the observable remnant of a higher-dimensional deterministic dynamics under strict constraints of symmetry, invariance, and information preservation.

Review
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Chong Ho Yu, Nino Miljkovic, Zhaoyang Wang

Abstract: Today, data are no longer confined to numerical values arranged in row-by-column matrices or stored neatly within relational databases. One of the defining characteristics of big data is its high variety, encompassing unstructured and multimodal forms such as text, audio, images, and video. These data types dominate contemporary domains including social media, digital humanities, biomedical research, education, and surveillance systems, yet they remain difficult to manage and analyze using traditional data management architectures. To cope with this shift, modern data management systems must move beyond schema-driven designs and incorporate multimodal artificial intelligence capable of understanding, integrating, and reasoning across heterogeneous data modalities. This article examines how multimodal AI—particularly large multimodal foundation models—can be leveraged to support the ingestion, representation, organization, and analysis of unstructured data. It discusses emerging multimodal data management frameworks, outlines a conceptual pipeline for multimodal data analysis, and highlights key challenges related to scalability, interpretability, and governance. By situating multimodal AI at the core of data management, this work argues that effective data analysis in the era of big data requires systems that treat meaning, context, and cross-modal relationships as first-class computational objects rather than afterthoughts.

Article
Public Health and Healthcare
Health Policy and Services

Ya-Min Yang, Yi-Wei Wang, Ahuva Averin, Anu Suokas, Mark Atwood, Mary MacKinnon, Liping Huang

Abstract: Background/Objectives. The 13-valent pneumococcal conjugate vaccine (PCV13), followed by the 23-valent pneumococcal polysaccharide vaccine (PPV23; PCV13→PPV23), was recommended for adults with high-risk conditions and all adults aged ≥65 years. With the availability of the 20-valent PCV (PCV20), which provides broader protection against pneumococcal disease, Taiwan CDC recently recommended replacing PCV13→PPV23 with PCV20. However, there is no economic evidence to support this recommendation. Therefore, the objective of this study was to assess the short- and long-term clinical and economic value of PCV20 to support the recommendation. Methods. A lifetime cost-effectiveness analysis (CEA) based on a single cohort and a five-year budget impact analysis (BIA) based on rolling cohorts were conducted from a healthcare system perspective to evaluate replacing PCV13→PPV23 with PCV20 among high-risk adults aged 18–64 years and all adults aged 65–99 years. Results. In the CEA, PCV20 was estimated to reduce pneumococcal disease cases by 4,684 and deaths by 160 among the model population (N = 5.5M) over a lifetime horizon. Total costs decreased by NT $2.3 billion while quality-adjusted life-years (QALYs) increased by 944, making PCV20 the dominant strategy versus PCV13→PPV23. The BIA showed budget savings of NT $5.4 billion over five years, including NT $2.4 billion in the first year. Conclusions. Switching to PCV20 for adults with high-risk conditions and all adults aged ≥65 years would substantially reduce the burden of pneumococcal disease and related deaths, leading to cost and budget savings for Taiwan's healthcare system.

Article
Medicine and Pharmacology
Hematology

Kenichi Ito, Tomoko Kitagawa, Saya Motohashi, Kazuhiko Hirano, Naohiro Sekiguchi

Abstract: Background: Cold agglutinin-associated hemolysis (CAH) occurs in diverse clinical contexts, including primary cold agglutinin disease (pCAD) and Waldenström macroglobulinemia-associated cold agglutinin syndrome (WM-CAS). The differentiation of these entities is often challenging, particularly in MYD88 L265P-negative cases. Since studies that examined the effects of chemoimmunotherapy (CIT) frequently predated routine molecular testing, it remains unclear whether the disease classification has an impact on treatment responses and durability. Methods: We retrospectively analyzed patients with pCAD, WM-CAS, and WM without CAS (WM-only) treated at a single center between April 2010 and November 2025. Diagnoses followed the revised fifth edition of the WHO Classification of Haematolymphoid Tumours. Clinicopathological features and outcomes were compared. Treatment responses were assessed using CAD-specific criteria based on hemoglobin and hemolysis markers. Time to next treatment (TTNT) and overall survival (OS) were analyzed using Kaplan-Meier methods. Results: Ten patients had CAH (5 pCAD, 5 WM-CAS) and 29 had WM-only. CAH cases showed higher lactate dehydrogenase and total bilirubin levels, whereas WM-only cases had higher serum IgM levels and greater bone marrow involvement. Rituximab-based CIT predominated as the first-line therapy for both pCAD and WM-CAS. Overall response rates were 100% in pCAD and 80% in WM-CAS, with similar kinetics. TTNT did not significantly differ between pCAD and WM-CAS. Elevated FDP was more frequent in CAH, while no overt thrombotic events or grade ≥3 infections were observed. Conclusions: Rituximab-based CIT effectively controlled CAH in both pCAD and WM-CAS, with similar durability. When hemolysis is the dominant clinical issue and the disease classification is unclear, tumor-directed therapy represents a treatment strategy.

Article
Social Sciences
Media studies

Safran Safar Almakaty

Abstract: This study presents a comprehensive qualitative synthesis and critical analysis of the transformation of foundational mass communication theories in the digital media age, spanning the period from 2000 to 2025. Drawing on a systematic integrative review of 23 scholarly manuscripts encompassing over 600 peer-reviewed sources, the investigation examines how ten canonical communication theories—Agenda Setting Theory, Cultivation Theory, Framing Theory, the Two-Step Flow of Communication, the Spiral of Silence, Uses and Gratifications Theory, Media Dependency Theory, Gatekeeping Theory, Diffusion of Innovation Theory, and Technological Determinism—have evolved, converged, and been reconceptualized in response to the affordances and constraints of digital platforms, algorithmic mediation, and networked communication environments. Employing a reflexive thematic analysis methodology grounded in a critical realist epistemology, the study identifies six overarching meta-themes: (a) the emergence of algorithmic agency as a structural force reshaping all theoretical paradigms, (b) the dialectical tension between expanded user agency and platform-imposed constraints, (c) the increasing platform specificity of communication effects, (d) the convergence and theoretical integration of formerly discrete paradigms, (e) persistent global inequities in digital communication power structures, and (f) the implications of generative artificial intelligence for foundational communication theory. Findings reveal that while the core premises of classical theories retain explanatory value, their operative mechanisms, boundary conditions, and societal implications have undergone fundamental transformation. An expanded and substantiated version of the Algorithmic Communication Ecology Model (ACEM) is proposed, synthesizing insights across all ten theories within a four-dimensional integrative architecture. 
Thirty-two specific recommendations for future research are formulated across eight thematic areas, directly addressing persistent gaps in the literature. The study contributes to WOS- and Scopus-indexed communication scholarship by providing a unified analytical lens through which the simultaneous preservation, disruption, and reconstitution of mass communication’s theoretical foundations can be systematically understood.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Marian Pompiliu Cristescu

Abstract: Citizen-reporting platforms generate high-volume, multilingual streams of service requests, yet operational triage often relies on coarse category labels and manual inspection. This study develops an explainable, calibration-aware analytics pipeline for FixMyStreet Brussels reports, combining text-based urgency modeling, topic discovery, and spatio-temporal hotspot scoring to support municipal decision-making. From 522,132 raw reports, we build an English-normalized text field for modeling, derive resolution-time outcomes from closed cases, and curate a 1,000-item gold standard with an explicit high-urgency class. A TF–IDF logistic regression baseline achieves strong classification performance and, after probability calibration, yields well-behaved confidence estimates suitable for risk-aware prioritization. Topic-level analyses reveal dominant themes related to sidewalks, road damage, and bulky waste, and hotspot scores highlight persistent, high-impact issue clusters. Event detection on aggregated signals did not identify statistically significant shocks during the analysis window, suggesting that the observed dynamics are driven by chronic, recurring problems rather than abrupt anomalies. Explainability audits via SHAP expose linguistically intuitive drivers for urgent cases (e.g., dangerous, risk, accident) and complaint-oriented terms (e.g., abandoned, illegal, dirty), providing transparent hooks for governance review.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Rao Xu, Yun Yang, Jiarong Qiu, Hengguang Cui, Yilin Sun, Zhongkang Li

Abstract: Decentralized federated learning (DFL) eliminates the single point of failure inherent in server-based architectures, enabling peer-to-peer collaborative model training. However, the absence of a central authority makes DFL particularly vulnerable to Byzantine attacks from malicious participants. Existing Byzantine-robust methods often fail to exploit the network topology structure of DFL. We propose TrustGraph-DFL, a novel defense mechanism that leverages graph-based trust modeling for Byzantine resilience. Our key insight is that consistency between a neighbor's model update direction and a node's local validation gradient can serve as an effective trust indicator. Each node computes consistency scores by comparing received updates against locally computed validation gradients, then maps these scores to dynamic edge weights for robust weighted aggregation. Experiments on CIFAR-10 demonstrate that TrustGraph-DFL achieves 3–5% higher accuracy than existing methods under 30% Byzantine nodes while maintaining a low false positive rate (approximately 9% at 50% Byzantine fraction, compared to 35% for Krum).
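The aggregation idea in the abstract can be sketched in NumPy (a simplified illustration under our own assumptions, not the paper's exact scoring rule): score each neighbor's update by its directional consistency with the local validation gradient, zero out opposing directions, normalize the scores into edge weights, and average the updates with those weights.

```python
import numpy as np

def trust_weighted_aggregate(local_val_grad, neighbor_updates, tau=0.0):
    """Sketch of consistency-scored aggregation (hypothetical details).

    Each neighbor's update is scored by cosine similarity with the
    node's local validation gradient; scores below tau (default 0,
    i.e. opposing directions) get zero trust, and the remaining scores
    are normalized into edge weights for a weighted average.
    """
    g = local_val_grad / (np.linalg.norm(local_val_grad) + 1e-12)
    scores = np.array([max(float(np.dot(u, g) / (np.linalg.norm(u) + 1e-12)), tau)
                       for u in neighbor_updates])
    if scores.sum() == 0:            # every neighbor looks Byzantine:
        return local_val_grad        # fall back to the local gradient
    w = scores / scores.sum()        # dynamic edge weights
    return np.sum(w[:, None] * np.array(neighbor_updates), axis=0)
```

With two honest neighbors pushing roughly along the local gradient and one Byzantine neighbor pushing the opposite way, the Byzantine update receives zero weight and the aggregate stays close to the honest direction.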

Article
Engineering
Mechanical Engineering

Wei Guo, Xin Li, Ce Shi, Jinhong Li, Zemin Sun, Yongjia Xu

Abstract: Shaking tables are critical facilities for simulating seismic effects via ground motion reproduction. However, single-table tests are often constrained by limited platform dimensions and load capacity. While multi-table synchronization overcomes these bottlenecks, traditional array control methods under rigid connections face challenges, including degraded precision from desynchronization and experimental interruptions due to output forces exceeding safety limits. To address high-precision synchronization requirements for rigid-connected dual-table arrays, this study proposes an impedance-based internal force coordination control strategy. This approach enhances synchronization accuracy and prevents failures from excessive coupling forces. Specifically, a global simulation model and a mechanical model of the dual-shaking table array under rigid connection were established. Through simulation and experimental validation, the impact of synchronization errors was evaluated and the strategy's efficacy verified. Results show the strategy significantly reduces force discrepancy between platforms. In simulation and experiments, average force discrepancy reductions reached 95.4% and 76.1%, respectively. Both displacement reproduction accuracy and synchronization precision improved. The method effectively circumvents experimental bottlenecks, such as output force saturation, inherently associated with rigid connections.

Article
Biology and Life Sciences
Horticulture

Jason W. Miesbauer, Edward F. Gilman, Andrew K. Koeser, Richard J. Hauer, Abigail C. Tumino, Chris Harchick
Abstract: Background: When branches lack a defined collar, arborists are left without a clear target to guide removal pruning. A common recommendation is to cut at a 45° angle from the branch bark ridge. Cutting perpendicular to the branch axis as an alternative would minimize effective wound size, potentially reducing wood dysfunction in the remaining stem. Methods: A total of 92 Acer rubrum L. ‘Florida Flame’ and 102 Quercus virginiana Mill. ‘Highrise’ branches without visible collars were pruned in one of two ways: (1) a removal cut angled 45° from the branch bark ridge (45°) or (2) a removal cut perpendicular to the branch axis (perpendicular). Three years later, pruned areas were harvested and assessed for wound closure and internal discoloration and decay, controlling for initial branch diameter, branch-to-parent-stem aspect ratio, sprout growth, and branch height. Results: In live oak, branch size and cut method affected the amount and length of discoloration observed. In red maple, discoloration and decay were largely a function of branch size and aspect ratio (i.e., the relative size difference between the removed branch and parent stem). In both species, cambial dieback was more common with perpendicular removal cuts, often negating any initial benefit associated with the smaller wound. Conclusions: When removing branches without a branch collar, we recommend making 45° cuts. Identifying which branches to remove or retain early in a tree’s life is important to avoid large branch removal cuts later.

Article
Medicine and Pharmacology
Orthopedics and Sports Medicine

Łukasz Stołowski, Gino Kerkhoffs, Tomasz Piontek
Abstract: (1) Background: Femoroacetabular impingement syndrome (FAIS) is a common cause of hip pain and functional limitation in young and physically active individuals. Although hip arthroscopy is an established treatment when conservative management fails, objective data on early postoperative changes in active hip range of motion (ROM) remain limited. This study aimed to evaluate changes in active hip ROM three months after arthroscopic treatment for FAIS using inertial measurement units (IMUs) and to investigate their relationship with patient-reported outcomes. (2) Methods: A prospective cohort of patients undergoing hip arthroscopy for FAIS was assessed preoperatively and at a three-month follow-up. Active hip ROM—including flexion, internal rotation, external rotation, and total rotation—was measured using IMU sensors, while subjective outcomes were evaluated using the Hip disability and Osteoarthritis Outcome Score (HOOS). (3) Results: Significant improvements were observed across all HOOS subscales at follow-up. Active hip ROM increased significantly in internal rotation, external rotation, and total rotation of the operated hip, whereas changes in hip flexion were minimal and no meaningful changes were observed in the non-operated hip. (4) Conclusions: Hip arthroscopy for FAIS leads to early improvements in both patient-reported outcomes and active hip mobility, particularly in rotational movements, although the relationship between ROM and subjective outcomes appears weak.

Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.
