REVIEW | doi:10.20944/preprints202305.1247.v1
Subject: Chemistry And Materials Science, Organic Chemistry Keywords: Sequential Reactions; Aminoalkynes; Heterocycles; Metal catalysis.
Online: 17 May 2023 (12:25:48 CEST)
Sequential reactions of aminoalkynes represent a powerful tool to easily assemble biologically important polyfunctionalized nitrogen heterocyclic scaffolds. Metal catalysis often plays a key role in the selectivity, efficiency, atom economy and green chemistry of these sequential approaches. This review examines the existing literature on the applications of reactions of aminoalkynes with carbonyls, which are emerging for their synthetic potential. Aspects concerning the features of the starting reagents, the catalytic systems, alternative reaction conditions and the reaction pathways, as well as the possible intermediates, are discussed.
ARTICLE | doi:10.20944/preprints202305.0545.v1
Subject: Chemistry And Materials Science, Applied Chemistry Keywords: Wintergreen oil; niobium pentachloride; sequential reactions; antimicrobial activity; cytotoxicity; S. aureus
Online: 8 May 2023 (14:51:06 CEST)
Methyl salicylate (MS), the principal constituent of wintergreen oil (WO), was obtained from acetylsalicylic acid (ASA) by a sequential transesterification-esterification reaction promoted by NbCl5 for the first time. The reagents were added simultaneously, and the reaction process, involving both transesterification and esterification, was monitored by thin-layer chromatography and gas chromatography (GC). The conversion determined by GC was 100%, and the MS yield was 94%. A cytotoxicity of 50% and 64% for cultured S. aureus and metastatic melanoma cells, respectively, was observed at a concentration of 0.6 mg/mL, whereas no cytotoxicity toward non-tumor cells was observed at this concentration, which is considered to be the optimum.
ARTICLE | doi:10.20944/preprints201711.0155.v1
Subject: Engineering, Electrical And Electronic Engineering Keywords: millimeter-wave; radiometer; sequential lobing
Online: 23 November 2017 (11:29:03 CET)
The paper investigates the theory of operation of a passive millimeter-wave seeker sensor using a fast electronic sequential-lobing technique and its experimental validation through laboratory trials. The paper analyzes in detail the theoretical performance of a difference-channel sensor and a pseudo-monopulse sensor, deriving practical formulas for estimating target angular tracking accuracy, and presents the subsequent experimental validation.
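The lobing principle above can be illustrated with a toy model: two beams squinted on either side of boresight, whose difference-over-sum ratio is inverted to estimate the target angle. The Gaussian beam shape, squint, and beamwidth below are hypothetical illustration values, not the sensor parameters of the paper.

```python
import math

def gaussian_beam_gain(theta, squint, bw):
    """Normalized one-way power gain of a Gaussian beam squinted by `squint` (degrees)."""
    return math.exp(-4 * math.log(2) * ((theta - squint) / bw) ** 2)

def monopulse_ratio(theta, squint=0.5, bw=2.0):
    """Difference-over-sum ratio of two sequentially lobed beams."""
    g1 = gaussian_beam_gain(theta, +squint, bw)
    g2 = gaussian_beam_gain(theta, -squint, bw)
    return (g1 - g2) / (g1 + g2)

def estimate_angle(ratio, slope):
    """Invert the (locally linear) ratio-vs-angle characteristic."""
    return ratio / slope

# Calibrate the small-angle slope numerically, then estimate a target angle.
d = 1e-4
slope = (monopulse_ratio(d) - monopulse_ratio(-d)) / (2 * d)
true_angle = 0.3  # degrees off boresight
est = estimate_angle(monopulse_ratio(true_angle), slope)
```

Near boresight the ratio is nearly linear in the off-axis angle, so a single calibrated slope recovers the angle to within a few percent of a beamwidth.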
ARTICLE | doi:10.20944/preprints202209.0389.v1
Subject: Chemistry And Materials Science, Analytical Chemistry Keywords: Yield; Fermentation; Sequential Pretreatment; Sorghum Straw
Online: 26 September 2022 (09:11:56 CEST)
The depletion of fossil fuels and the environmental problems associated with them have encouraged the search for alternative feedstocks that do not compromise food security or the environment. Sorghum is a fast-growing crop that can be harvested twice a year and produces both food and straw that can be utilized in the production of bio-based fuels. In this study, the production of bioethanol and the effects of fermentation parameters on ethanol yield are presented. A sequential pretreatment method was employed, using dilute sulfuric acid (1%) at 125 °C in the first stage and dilute sodium hydroxide (1.25%) at 90 °C in the second stage. The residues left after the sequential pretreatment were hydrolyzed using acid hydrolysis. The sugar concentration of the hydrolysates was determined using the phenol-sulfuric acid method, and three hydrolysates with sugar levels of 30.42 g/L, 31.79 g/L, and 32.9875 g/L were selected for fermentation. The ethanol yield was determined after 72 hours of fermentation at 30 °C with varying inoculum sizes (5%, 10%, and 15%) and pH values (4.5, 5, and 5.5). Statistical analysis showed that all three independent parameters affected ethanol yield, with a maximum yield of 0.617 mL/g (48.742%) obtained at a sugar content of 32.9875 g/L, pH 5, and an inoculum size of 15%. According to these findings, increasing sugar content, inoculum size, and pH initially raises ethanol yield, but further increases reduce it. Therefore, fermentation conditions must be carefully managed to maximize the ethanol yield obtained from sequential acidic-alkaline pretreated sorghum straw. The strategy of sequential acidic-alkaline pretreatment of sorghum straw offers prospects for the efficient and effective production of biofuels from alternative feedstocks.
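As a rough sketch of the yield bookkeeping involved in such fermentation studies, the fraction of the stoichiometric maximum (0.511 g ethanol per g glucose, the standard theoretical limit) can be computed from a measured ethanol volume. The broth numbers below are hypothetical and are not taken from the study.

```python
# Hypothetical yield calculation; 0.511 g/g is the standard stoichiometric
# maximum for glucose fermentation, and 0.789 g/mL is ethanol's density.
ETHANOL_DENSITY = 0.789   # g/mL at 20 degrees C
THEORETICAL_MAX = 0.511   # g ethanol per g sugar (stoichiometric limit)

def yield_fraction(ethanol_ml, sugar_g):
    """Ethanol yield as a fraction of the stoichiometric maximum."""
    ethanol_g = ethanol_ml * ETHANOL_DENSITY
    return (ethanol_g / sugar_g) / THEORETICAL_MAX

# Hypothetical broth: 30 g of sugar fully consumed, 12 mL ethanol produced.
frac = yield_fraction(ethanol_ml=12.0, sugar_g=30.0)
```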
ARTICLE | doi:10.20944/preprints201709.0001.v1
Subject: Engineering, Industrial And Manufacturing Engineering Keywords: sequential methods; change-point detection; online algorithms
Online: 1 September 2017 (05:24:10 CEST)
Sequential hypothesis testing and change-point detection when the distribution parameters are unknown are fundamental problems in statistics and machine learning. We show that for such problems, detection procedures based on sequential likelihood ratios with simple one-sample update estimates, such as online mirror descent, are nearly second-order optimal. This means that the upper bound on the algorithm's performance meets the lower bound asymptotically, up to a log-log factor in the false-alarm rate as it tends to zero. This is a blessing: although generalized likelihood ratio (GLR) statistics are theoretically optimal, they cannot be computed recursively, and their exact computation usually requires infinite memory of historical data. We prove the nearly second-order optimality by making a connection between sequential analysis and online convex optimization and leveraging the logarithmic regret bound of the online mirror descent algorithm. Numerical and real-data examples validate our theory.
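A minimal sketch of the idea, assuming a Gaussian mean-shift model: an adaptive CUSUM whose unknown post-change mean is estimated by a simple one-sample running-average update. The running average is a crude stand-in for the online mirror descent estimator analyzed in the paper; the data and threshold are illustrative.

```python
def adaptive_cusum(xs, threshold=10.0):
    """Adaptive CUSUM for a positive mean shift in N(0,1) data.

    The unknown post-change mean is estimated with a one-sample
    running-average update (a stand-in for the online mirror descent
    estimator in the paper). Returns the first alarm time, or None.
    """
    stat, mu_hat, n = 0.0, 0.5, 0   # mu_hat starts at a small positive guess
    for t, x in enumerate(xs, start=1):
        # One-sample log-likelihood ratio of N(mu_hat, 1) against N(0, 1).
        llr = mu_hat * x - 0.5 * mu_hat ** 2
        stat = max(0.0, stat + llr)          # recursive CUSUM update
        n += 1
        # One-sample update of the mean estimate, floored away from zero.
        mu_hat = max(0.1, mu_hat + (x - mu_hat) / n)
        if stat >= threshold:
            return t
    return None

# Deterministic toy stream: no signal for 200 steps, then a shift of 1.5.
data = [0.0] * 200 + [1.5] * 100
alarm = adaptive_cusum(data)
```

The whole procedure is recursive with O(1) memory, which is exactly the advantage over GLR statistics noted above.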
ARTICLE | doi:10.20944/preprints202310.1947.v1
Subject: Physical Sciences, Atomic And Molecular Physics Keywords: Cold atom; Ramsey spectrum; virtual instrument; sequential control
Online: 31 October 2023 (04:30:41 CET)
In this paper, a study was conducted on Ramsey interference and spectral-line locking techniques based on rubidium-87 cold atoms. Using a sequential control technique, the timing design of a three-dimensional magneto-optical trap (MOT), the cold atoms' free fall, Ramsey microwave interference, and two-level fluorescence detection was realized, achieving the state preparation of cold atoms and clock transitions. By designing an automatic peak-finding algorithm, numerical integration was performed on the time-of-flight (TOF) signal, and narrow-linewidth cold-atom Ramsey interference fringes were obtained. On this basis, a frequency-hopping method was used to lock the frequency of the 10 MHz oven-controlled crystal oscillator (OCXO) in a closed loop, achieving a linewidth of 38 Hz. This preliminary result verifies the feasibility of the system design.
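The peak-finding and numerical-integration step can be sketched on a synthetic TOF trace: locate the fluorescence maximum, then trapezoid-integrate a window around it. The Gaussian pulse parameters and window width are hypothetical, not the experiment's values.

```python
import math

def integrate_peak(signal, dt, window=25):
    """Locate the maximum of a TOF trace and trapezoid-integrate around it."""
    i = max(range(len(signal)), key=signal.__getitem__)   # automatic peak find
    lo, hi = max(0, i - window), min(len(signal) - 1, i + window)
    area = 0.0
    for k in range(lo, hi):
        area += 0.5 * (signal[k] + signal[k + 1]) * dt    # trapezoid rule
    return i, area

# Synthetic Gaussian TOF pulse: center 50 ms, width (sigma) 5 ms, 1 ms sampling.
dt = 0.001
trace = [math.exp(-((t * dt - 0.05) ** 2) / (2 * 0.005 ** 2)) for t in range(100)]
peak_index, area = integrate_peak(trace, dt)
```

For this pulse the 25-sample window spans five standard deviations on each side, so the integral is essentially the full Gaussian area, sigma times the square root of 2*pi.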
ARTICLE | doi:10.20944/preprints201805.0439.v1
Subject: Medicine And Pharmacology, Pharmacology And Toxicology Keywords: neem leaves; sequential pressurized liquid extraction; antiproliferative activity
Online: 30 May 2018 (07:45:25 CEST)
Azadirachta indica A. Juss (neem) extracts have been used in pharmaceutical applications as antitumor agents, owing to their terpenes and phenolic compounds. To obtain extracts from neem leaves with potential antiproliferative effect, a sequential pressurized liquid extraction process was carried out in a fixed-bed extractor at 25 ºC and 100 bar, using hexane (SH), ethyl acetate (SEA) and then ethanol (SE) as solvents. An extraction using only ethanol (EE) was also conducted to compare the characteristics of the fractionated extracts. The results obtained by liquid chromatography-electrospray ionization mass spectrometry suggested the highest concentration of terpenes in the SEA extract in comparison to the SH, SE and EE extracts. Accordingly, antiproliferative assays showed that the SEA extract was the most efficient inhibitor of the human tumor cell lines MCF-7, NCI-H460, HeLa, and HepG2 among all extracts studied. Moreover, normal hepatocellular cells were more resistant to SH, SEA, SE, and EE than the malignant breast, lung, hepatocellular, and cervical cells. The fractionated neem extracts obtained in the present study therefore appear to be more selective for malignant cells than for normal cells.
ARTICLE | doi:10.20944/preprints201702.0061.v1
Subject: Engineering, Electrical And Electronic Engineering Keywords: multi-target tracking; multi-Bernoulli filter; sequential Monte-Carlo
Online: 16 February 2017 (09:39:29 CET)
We develop an interactive likelihood (ILH) for sequential Monte-Carlo (SMC) methods for image-based multiple target tracking applications. The purpose of the ILH is to improve tracking accuracy by reducing the need for data association. In addition, we integrate a recently developed deep neural network for pedestrian detection along with the ILH with a multi-Bernoulli filter. We evaluate the performance of the multi-Bernoulli filter with the ILH and the pedestrian detector in a number of publicly available datasets (2003 PETS INMOVE, AFL, and TUD-Stadtmitte) using standard, well-known multi-target tracking metrics (OSPA and CLEAR MOT). In all datasets, the ILH term increases the tracking accuracy of the multi-Bernoulli filter.
ARTICLE | doi:10.20944/preprints202304.0381.v1
Subject: Physical Sciences, Astronomy And Astrophysics Keywords: Oscillating Sequential Universes Model; Universe's Evolution; dark energy; dark matter
Online: 17 April 2023 (03:22:11 CEST)
The Oscillating Sequential Universes Model (OSUM), an innovative cosmological model describing the universe's evolution through the interplay of dark energy and dark matter, is presented in this study. The model proposes a series of expansions and contractions driven by dark energy and dark matter interactions, challenging the conventional view of a singular Big Bang event followed by continuous expansion. The necessity for model refinements, incorporating new observational constraints, reassessing dark energy-dark matter interaction assumptions, and investigating alternative mathematical formulations for the oscillation mechanism, is also discussed. The OSUM offers a comprehensive framework for future research and potential discoveries in astrophysics, cosmology, and fundamental physics, raising new questions and challenging prevailing assumptions about the cosmos.
COMMUNICATION | doi:10.20944/preprints202212.0261.v1
Subject: Environmental And Earth Sciences, Geophysics And Geology Keywords: Active remote sensing; Sequential rotationally excited; Circularly Polarized; GPR; SNR
Online: 15 December 2022 (03:03:26 CET)
As an effective active remote sensing (ARS) technology for shallow underground targets, ground penetrating radar (GPR) is a detection method that obtains characteristic information about underground targets by transmitting electromagnetic waves from an antenna and analyzing their propagation underground. Because of the high operating frequency of GPR (1 MHz-3 GHz), the depth of geological exploration is shallow (0.1 m-30 m). In order to remotely sense the deep earth, it is necessary to increase the size of the radiation source so as to reduce the radiation frequency. At the same time, most separated GPRs use a single dipole antenna (SD) as the radiation source and another antenna, placed along the electromagnetic wave propagation direction in the far field, as a remote sensing sensor (RSS); both are horizontally linearly polarized (LP) antennas. In some cases, such a design is apt to cause problems such as multipath effects (ME) and polarization mismatch (PM). When GPR is used for ARS of the deep earth, this often results in increased errors and signal attenuation during data reception and processing. In contrast, if the radiation source uses large-aperture multiple-dipole antennas (MD) with multi-channel sequential rotational excitation, the electromagnetic wave can radiate outward with circular polarization at a low frequency. At the RSS, the trouble caused by ME and PM can then be reduced even if LP antennas are used. A novel sequential rotationally excited (SRE) circularly polarized (CP) MD array for separated GPR in ARS of the deep earth is proposed in this paper, which uses a large-aperture CP MD array instead of a small-size LP SD.
The analysis and simulation results demonstrate that, for the same transmitting power, using the SRE CP MD antenna array as the radiation source yields a significant enhancement (about 7 dB) of the signal-to-noise ratio (SNR) of the radiant energy collected at the RSS, compared with linear polarization. More importantly, by reducing the exploration frequency to 10 kHz, the exploration depth is also greatly increased, by about 10 times.
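The polarization mechanism can be checked with a minimal two-component field model: when orthogonal dipole components are excited in phase quadrature (the idealized effect of sequential rotational excitation), the total field magnitude is constant over time, i.e. circular polarization, while identical excitation gives an oscillating, linearly polarized field. This is an idealized on-boresight sketch, not the paper's array simulation.

```python
import math

def field_magnitude(t, sequential=True):
    """Far-field |E| on boresight for two orthogonal dipole components.

    With sequential (90-degree-stepped) excitation the components are in
    phase quadrature and the total field rotates at constant magnitude
    (circular polarization); with identical excitation it oscillates
    in a fixed direction (linear polarization).
    """
    w = 2 * math.pi  # 1 Hz angular frequency, for illustration only
    ex = math.cos(w * t)
    ey = math.sin(w * t) if sequential else math.cos(w * t)
    return math.hypot(ex, ey)

cp = [field_magnitude(t / 100) for t in range(100)]                    # circular
lp = [field_magnitude(t / 100, sequential=False) for t in range(100)]  # linear
```

The circular-polarization trace is flat (the field vector rotates at constant amplitude), while the linear-polarization trace swings between zero and its peak, which is why an LP receiver suffers polarization mismatch that a CP source avoids.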
ARTICLE | doi:10.20944/preprints202010.0359.v1
Subject: Chemistry And Materials Science, Biomaterials Keywords: β-iPP; simultaneous biaxial stretching; sequential biaxial stretching; structure evolution
Online: 19 October 2020 (08:07:59 CEST)
In this paper, the lamellar structural evolution and microvoid variations of β-iPP during two different stretching processes, sequential biaxial stretching and simultaneous biaxial stretching, were investigated in detail. It was found that the different stretching methods led to significantly different lamellae deformation modes, and the microporous membranes obtained by simultaneous biaxial stretching exhibited better mechanical properties. For sequential biaxial stretching, abundant coarse fibers originated from the tight accumulation of the lamellae parallel to the longitudinal stretching direction, whereas the lamellae perpendicular to the stretching direction were easily deformed and separated. Those coarse fibers were difficult to separate into micropores during the subsequent transverse stretching process, resulting in a poor micropore distribution. For simultaneous biaxial stretching, by contrast, the β crystals shared the same deformation mode: the lamellae distributed in different directions were all destroyed, forming abundant microvoids and few coarse fibers.
ARTICLE | doi:10.20944/preprints201705.0119.v1
Subject: Business, Economics And Management, Econometrics And Statistics Keywords: subunit distribution; structural analysis; statistical hypothesis; parameter estimation; sequential test
Online: 16 May 2017 (07:41:47 CEST)
Studies on the structure of economic systems are most frequently carried out by the methods of informational statistics. These methods, often accompanied by a wide range of indicators (Shannon entropy, Balassa coefficient, Herfindahl specialty index, Gini coefficient, Theil index, etc.) around which a wide literature has been created over time, have a major disadvantage. This weakness is the imposition of the system condition, and therefore the need to know all the components of the system (as absolute values or as weights). This restriction is difficult to satisfy in some situations, and in others, such knowledge may be irrelevant, especially when the interest lies in structural changes in only some of the components of the economic system (whether we refer to the typology of economic activities, NACE, or of territorial units, NUTS). This article presents a procedure for characterizing the structure of a system and for comparing its evolution over time in the case of incomplete information, thus eliminating the restriction of the classical methods. The proposed methodological alternative uses a parametric distribution with subunit values for the variable. The application refers to Gross Domestic Product values for five of the 28 European Union countries with annual values of over 1,000 billion Euros (Germany, Spain, France, Italy and the United Kingdom) for the years 2003 and 2015. A form of the Wald sequential test is applied to measure changes in the structure of this group of countries between the compared years. The results of this application validate the proposed method.
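For context, Wald's sequential probability ratio test, of which the test applied in the article is a form, can be sketched for a simple Bernoulli hypothesis pair. The hypotheses, error rates, and data below are illustrative, not those of the GDP application.

```python
import math

def wald_sprt(xs, p0=0.4, p1=0.6, alpha=0.05, beta=0.05):
    """Wald's SPRT for H0: Bernoulli(p0) versus H1: Bernoulli(p1).

    Observations are consumed one at a time; sampling stops as soon as
    the cumulative log-likelihood ratio crosses a Wald boundary.
    Returns (decision, number of observations used).
    """
    a = math.log(beta / (1 - alpha))   # lower boundary -> accept H0
    b = math.log((1 - beta) / alpha)   # upper boundary -> accept H1
    llr = 0.0
    for n, x in enumerate(xs, start=1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr <= a:
            return "H0", n
        if llr >= b:
            return "H1", n
    return "continue", len(xs)

# A run of successes crosses the upper boundary after only a few samples.
decision, n_used = wald_sprt([1] * 20)
```

The appeal of the sequential form is exactly what the article exploits: a decision is reached from however much data happens to be available, without fixing the sample size in advance.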
ARTICLE | doi:10.20944/preprints202305.0904.v1
Subject: Computer Science And Mathematics, Probability And Statistics Keywords: Complete convergence; r-quick convergence; sequential analysis; hypothesis testing; changepoint detection
Online: 12 May 2023 (08:25:35 CEST)
In the first part of this article, we discuss and generalize the complete convergence introduced by Hsu and Robbins (1947) to the r-complete convergence introduced by Tartakovsky (1998). We also establish its relation to the r-quick convergence first introduced by Strassen (1967) and extensively studied by Lai (1976). Our work is motivated by various statistical problems, mostly in sequential analysis. As we show in the second part, generalizing and studying these convergence modes is important not only in probability theory but also to solve challenging statistical problems in hypothesis testing and changepoint detection for general stochastic non-i.i.d. models.
Subject: Engineering, Control And Systems Engineering Keywords: error-correction coding, polar codes, convolutional codes, list decoding, sequential decoding
Online: 11 May 2021 (10:42:11 CEST)
Polar coding gives rise to the first explicit family of codes that provably achieve capacity with efficient encoding and decoding for a wide range of channels. Recently, Arikan presented a new polar coding scheme, which he called polarization-adjusted convolutional (PAC) codes. At short blocklengths, such codes offer a dramatic improvement in performance as compared to CRC-aided list decoding of conventional polar codes. PAC codes are based primarily upon the following main ideas: replacing CRC codes with convolutional precoding (under appropriate rate profiling) and replacing list decoding by sequential decoding. Simulation results show that PAC codes, resulting from the combination of these ideas, are close to finite-length lower bounds on the performance of any code under ML decoding. One of our main goals in this paper is to answer the following question: is sequential decoding essential for the superior performance of PAC codes? We show that similar performance can be achieved using list decoding when the list size $L$ is moderately large (say, $L \geq 128$). List decoding has distinct advantages over sequential decoding in certain scenarios, such as low-SNR regimes or situations where the worst-case complexity/latency is the primary constraint. Another objective is to provide some insights into the remarkable performance of PAC codes. We first observe that both sequential decoding and list decoding of PAC codes closely match ML decoding thereof. We then estimate the number of low weight codewords in PAC codes, and use these estimates to approximate the union bound on their performance. These results indicate that PAC codes are superior to both polar codes and Reed-Muller codes. We also consider random time-varying convolutional precoding for PAC codes, and observe that this scheme achieves the same superior performance with constraint length as low as $\nu = 2$.
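The union-bound approximation mentioned above can be sketched for BPSK signaling over an AWGN channel: the bound sums, over codeword weights w, the multiplicity A_w times the pairwise error probability Q(sqrt(2 w R Eb/N0)). The weight spectrum used here is hypothetical, not the estimated PAC-code spectrum from the paper.

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def union_bound(weight_spectrum, rate, ebno_db):
    """Truncated union bound on codeword error rate, BPSK over AWGN.

    `weight_spectrum` maps codeword weight w to multiplicity A_w; only the
    low-weight terms matter at moderate-to-high SNR, which is why estimating
    the number of low-weight codewords suffices.
    """
    ebno = 10 ** (ebno_db / 10)
    return sum(a * q_func(math.sqrt(2 * w * rate * ebno))
               for w, a in weight_spectrum.items())

spectrum = {8: 300, 12: 2000, 16: 50000}   # hypothetical A_w values
p_low = union_bound(spectrum, rate=0.5, ebno_db=3.0)
p_high_snr = union_bound(spectrum, rate=0.5, ebno_db=5.0)
```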
REVIEW | doi:10.20944/preprints202308.1728.v1
Subject: Computer Science And Mathematics, Computer Science Keywords: Recommendation systems; collaborative filtering; sequential patterns; e-commerce; purchase and clickstream history
Online: 24 August 2023 (09:45:51 CEST)
E-commerce recommendation systems usually deal with massive customer sequential databases, such as historical purchase or clickstream sequences. Recommendation accuracy can be improved if complex sequential patterns of user purchase behavior are learned by integrating sequential patterns of customer clicks and/or purchases into the user-item rating matrix input of collaborative filtering. This review focuses on the algorithmic techniques of existing sequential-pattern-based e-commerce recommendation systems, such as ChoRec05, ChenRec09, HuangRec09, LiuRec09, ChoiRec12, Hybrid Model RecSys16, Product RecSys16, SainiRec17, HPCRec18 and HSPCRec19. It provides a comprehensive and comparative performance analysis of these systems, exposing their methodologies, achievements, limitations, and potential for solving more important problems in this domain. The review showed that integrating sequential pattern mining of historical purchase and/or click sequences into the user-item matrix for collaborative filtering (i) improved recommendation accuracy, (ii) reduced user-item rating data sparsity, (iii) increased the novelty rate of recommendations, and (iv) improved the scalability of the recommendation system.
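The core integration step, blending clickstream counts into the user-item rating matrix before similarity-based collaborative filtering, can be sketched as below. The data, blending rule, and weight are illustrative stand-ins, not the algorithm of any of the reviewed systems.

```python
import math

def cosine(u, v):
    """Cosine similarity between two user vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

def enrich(ratings, clicks, weight=0.5):
    """Blend explicit ratings with normalized click counts per user-item cell.

    Cells with no rating but some clicks become nonzero, which is one way
    click integration reduces the sparsity of the rating matrix.
    """
    max_click = max(max(row) for row in clicks) or 1
    return [[r + weight * c / max_click for r, c in zip(rrow, crow)]
            for rrow, crow in zip(ratings, clicks)]

# Tiny hypothetical data: 3 users x 4 items (0 = no rating / no clicks).
ratings = [[5, 0, 3, 0],
           [4, 0, 0, 2],
           [0, 3, 4, 0]]
clicks  = [[9, 1, 4, 0],
           [7, 0, 2, 3],
           [0, 5, 6, 1]]
matrix = enrich(ratings, clicks)
# Under the enriched matrix, which user is most similar to user 0?
sims = [cosine(matrix[0], matrix[u]) for u in (1, 2)]
```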
REVIEW | doi:10.20944/preprints202308.0379.v1
Subject: Chemistry And Materials Science, Organic Chemistry Keywords: Bimetallic catalysis; multimetallic catalysis; sequential catalysis; N-heterocycles; transition metals; green chemistry
Online: 4 August 2023 (14:50:41 CEST)
Bimetallic (or multimetallic) catalysis has emerged as a powerful tool in modern chemical synthesis, offering improved reaction control and versatility. This review focuses on recent developments in bimetallic sequential catalysis for the synthesis of nitrogen heterocycles, essential building blocks in pharmaceuticals and fine chemicals. The cooperative action of two (and sometimes more) different metal catalysts enables intricate control over reaction pathways, enhancing the selectivity and efficiency of N-heterocycle synthesis. By activating less reactive substrates, this multimetal catalytic strategy opens new synthetic possibilities for challenging compounds. The use of catalytic materials in bimetallic systems reduces waste and improves atom efficiency, aligning with green chemistry principles. With a diverse range of metal combinations and reaction conditions, bimetallic catalysis provides access to a broad array of N-heterocyclic compounds with various functionalities. This paper highlights the significant progress made in this area over the past decade, emphasizing the promising potential of bimetallic catalysis in drug discovery and the fine chemical industries.
ARTICLE | doi:10.20944/preprints202104.0298.v1
Subject: Computer Science And Mathematics, Algebra And Number Theory Keywords: Quantum games; classical games; sequential quantum games; groups of actions; winning strategies
Online: 12 April 2021 (12:45:52 CEST)
This paper studies sequential quantum games under the assumption that the moves of the players are drawn from groups and not just plain sets. The extra group structure makes it possible to easily derive some very general results, which, to the best of our knowledge, are stated in this generality for the first time in the literature. The main conclusion of this paper is that the specific rules of a game are absolutely critical. The slightest variation may have an important impact on the outcome of the game. It is the combination of two factors that determines who wins: (i) the sets of admissible moves for each player, and (ii) the order of moves, i.e., whether the same player makes the first and the last move. Quantum strategies do not a priori prevail over classical strategies. By carefully designing the rules of the game, the advantage of either player can be established. Alternatively, the fairness of the game can also be guaranteed.
ARTICLE | doi:10.20944/preprints202006.0204.v1
Subject: Medicine And Pharmacology, Gastroenterology And Hepatology Keywords: Presepsin; Sepsis; Sequential Organ Failure Assessment (SOFA) score; alkaline phosphatase (ALP); Bile
Online: 16 June 2020 (09:43:45 CEST)
Presepsin is a diagnostic and prognostic biomarker of sepsis; however, elevated presepsin levels have also been documented without sepsis. This study retrospectively analyzes the laboratory parameters and Sequential Organ Failure Assessment (SOFA) scores affecting presepsin levels in 567 patients. Some patients with elevated presepsin levels exhibited renal dysfunction or elevated biliary enzymes despite a low SOFA score. Univariate regression analysis revealed a close correlation between presepsin levels and the SOFA score, serum creatinine (CRE), blood urea nitrogen, and biliary enzymes. In addition, multivariate regression analysis revealed that the SOFA score, alkaline phosphatase (ALP), and CRE each independently and significantly affected presepsin levels. Analysis of covariance (ANCOVA) revealed that presepsin levels were significantly higher in patients with hepatobiliary disease. Moreover, we found that patients who presented with dilatation of the intra- or extrahepatic bile ducts and elevation of ALP or total bilirubin exhibited remarkably high presepsin levels in the bile. Furthermore, presepsin production in the liver's Kupffer cells was established by immunostaining in patients who underwent surgical liver resection. Overall, this study elucidates that elevation of biliary enzymes affects presepsin levels, that presepsin exists at high concentrations in the bile, and that Kupffer cells stain positive for presepsin.
ARTICLE | doi:10.20944/preprints201812.0058.v1
Subject: Engineering, Mechanical Engineering Keywords: big data; parameter estimation; model updating; system identification; sequential Monte Carlo sampler
Online: 4 December 2018 (11:17:24 CET)
In this paper the authors present a method which facilitates computationally efficient parameter estimation of dynamical systems from a continuously growing set of measurement data. It is shown that the proposed method, which utilises Sequential Monte Carlo samplers, is guaranteed to be fully parallelisable (in contrast to Markov chain Monte Carlo methods) and can be applied to a wide variety of scenarios within structural dynamics. Its ability to allow convergence of one's parameter estimates, as more data is analysed, sets it apart from other sequential methods (such as the particle filter).
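A minimal sketch of such a Sequential Monte Carlo sampler, assuming a Gaussian likelihood with an unknown mean as a stand-in for the dynamical-system parameter: each new datum triggers a reweighting of the particles (an embarrassingly parallel map, which is the parallelisability claimed above), followed by resampling and a small jitter move. All model choices and tuning constants here are illustrative assumptions.

```python
import math
import random

def loglik(theta, x):
    """Gaussian log-likelihood of one observation with unknown mean theta."""
    return -0.5 * (x - theta) ** 2

def smc_step(particles, weights, x, rng):
    """Assimilate one new datum: reweight, resample, jitter."""
    # Reweighting each particle is independent of the others, so this map
    # is trivially parallelisable (unlike a serial MCMC chain).
    logw = [math.log(w) + loglik(p, x) for p, w in zip(particles, weights)]
    m = max(logw)                              # stabilise the exponentials
    w = [math.exp(l - m) for l in logw]
    s = sum(w)
    w = [wi / s for wi in w]
    # Multinomial resampling followed by a small random-walk jitter.
    new = rng.choices(particles, weights=w, k=len(particles))
    new = [p + rng.gauss(0, 0.05) for p in new]
    return new, [1.0 / len(new)] * len(new)

rng = random.Random(1)
particles = [rng.uniform(-5, 5) for _ in range(500)]
weights = [1.0 / 500] * 500
data = [rng.gauss(2.0, 1.0) for _ in range(100)]   # growing measurement set
for x in data:
    particles, weights = smc_step(particles, weights, x, rng)
posterior_mean = sum(particles) / len(particles)
```

As more data arrives the particle cloud concentrates around the true parameter value (2.0 here), illustrating the convergence of the parameter estimate noted above.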
ARTICLE | doi:10.20944/preprints202306.1270.v1
Subject: Engineering, Civil Engineering Keywords: lightweight structures; bubble deck concrete slabs; numerical homogenization; weight minimization; sequential quadratic programming
Online: 19 June 2023 (03:17:58 CEST)
In engineering practice, one often encounters optimization problems whose goal is to minimize material consumption, stresses, or deflections of a structure. In most cases, these problems are addressed with finite element analysis software and simple optimization algorithms. However, the optimization of certain structures is not so straightforward. An example is bubble deck slabs, in which regularly arranged air cavities over the entire surface of the ceiling are used to reduce the dead weight. In these slabs, the flexural stiffness is not constant across all cross-sections, which means that structural finite elements (plate or shell) cannot be used for static calculations, and the optimization process therefore becomes more difficult. The paper presents a procedure for minimizing the weight of bubble deck slabs using numerical homogenization and sequential quadratic programming with constraints. Homogenization makes it possible to determine the effective stiffnesses of the floor, which are then sequentially corrected by changing the geometrical parameters of the floor and the voids in order to achieve the assumed deflection. The presented procedure minimizes the use of material in a quick and effective way by automatically determining the optimal parameters describing the geometry of the bubble deck floor cross-section.
ARTICLE | doi:10.20944/preprints202305.2267.v1
Subject: Chemistry And Materials Science, Analytical Chemistry Keywords: N-acetyl-L-cysteine; flow injection analysis; sequential injection analysis; spectrophotometric determination; pharmaceuticals
Online: 31 May 2023 (14:37:59 CEST)
N-acetyl-L-cysteine (NAC), a sulfhydryl-containing compound, is mainly used as a mucolytic and as an antidote for acetaminophen overdose. Flow injection and sequential injection systems were designed and optimized with the aim of providing precise, accurate and reliable flow methods for NAC determination in pharmaceuticals with very low sample and reagent consumption. The proposed methods are based on a redox reaction wherein NAC reduces a complex of Cu(II) and bathocuproine disulfonate (BCS) to the orange [Cu(BCS)₂]³⁻ complex, whose absorption was measured at λmax = 483 nm. The optimized FIA and SIA configurations yielded linear calibration curves with correlation coefficients R² = 0.9999 and R² = 0.9996 in the concentration ranges 3.0 × 10⁻⁷–3.0 × 10⁻⁵ mol L⁻¹, with an analytical frequency of 120 h⁻¹, for the FIA method, and 4.0 × 10⁻⁷–4.0 × 10⁻⁵ mol L⁻¹, at a sampling rate of 60 h⁻¹, for the SIA method. The proposed flow methods were successfully applied to the determination of NAC in pharmaceutical products, as the results showed good agreement with the standard method prescribed by the Pharmacopoeia. Recoveries ranged from 98.4% to 101.9% for the FIA method and from 97.2% to 101.8% for the SIA method.
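The calibration and recovery computations underlying such flow methods can be sketched with an ordinary least-squares fit of absorbance against concentration. The absorbance readings below are hypothetical values spanning the FIA working range, not the published data.

```python
def linear_fit(x, y):
    """Ordinary least squares for y = a*x + b; also returns R^2."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    ss_res = sum((yi - (a * xi + b)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1 - ss_res / ss_tot

# Hypothetical absorbance readings over the FIA working range (mol/L).
conc = [3e-7, 1e-6, 3e-6, 1e-5, 3e-5]
absb = [0.004, 0.013, 0.040, 0.131, 0.392]
a, b, r2 = linear_fit(conc, absb)

def recovery(measured_abs, spiked_conc):
    """Found concentration over spiked concentration, in percent."""
    found = (measured_abs - b) / a
    return 100 * found / spiked_conc

rec = recovery(measured_abs=0.130, spiked_conc=1e-5)
```

A recovery close to 100% indicates the calibration correctly converts an absorbance reading back into the concentration actually present, which is how the 97-102% figures above are obtained.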
ARTICLE | doi:10.20944/preprints201812.0317.v1
Subject: Biology And Life Sciences, Virology Keywords: universal influenza vaccine; chimeric hemagglutinin; nucleoprotein; live attenuated influenza vaccine; sequential immunization; ferret model
Online: 27 December 2018 (10:14:21 CET)
The development of universal influenza vaccines, i.e. vaccines that can provide broad protection against seasonal and potentially pandemic influenza viruses, has been a priority for more than 20 years. Several approaches have been proposed that redirect the adaptive immune responses from immunodominant hypervariable regions to low-immunogenic but highly conserved regions of viral proteins. Here we induced broadly reactive anti-hemagglutinin (HA) stalk antibodies by sequential immunizations with live attenuated influenza vaccines (LAIVs) expressing chimeric HA (cHA). These vaccines contained the HA stalk domain from H1N1pdm09 virus but antigenically unrelated globular head domains from avian influenza viruses H5N1, H8N4 and H9N2. In addition, the source of the viral nucleoprotein (NP) of the LAIV strains was changed from A/Leningrad/17 master donor virus (MDV) to wild-type (WT) H1N1pdm09 virus, in order to induce CD8 T-cell immune responses more relevant to current infections. To avoid any difference in the protective effect of the various anti-neuraminidase (NA) antibodies, all LAIVs were engineered to contain the NA gene of Len/17 MDV. Naïve ferrets were immunized with three doses of (i) classical LAIVs containing non-chimeric HA and NP from MDV (LAIVs (NP-MDV)); (ii) cHA-based LAIVs containing NP from MDV (cHA LAIVs (NP-MDV)); and (iii) cHA-based LAIVs containing NP from H1N1pdm09 virus (cHA LAIVs (NP-WT)). A high-dose challenge with H1N1pdm09 virus induced significant pathology in the control, non-immunized ferrets, including high virus titers in respiratory tissues, clinical signs of disease and histopathological changes in nasal turbinates and lung tissues. All three vaccination regimens protected animals from clinical manifestations of disease: immunized ferrets did not lose weight or show clinical symptoms, and their fever was significantly lower than in the control group. 
Further analysis of virological and pathological data revealed the following hierarchy in the cross-protective efficacy of the vaccines: cHA LAIVs (NP-WT) > cHA LAIVs (NP-MDV) > LAIVs (NP-MDV). This ferret study showed that prototype universal LAIVs that combine the two approaches of inducing anti-HA stalk antibody and more relevant CD8 T-cell immune responses are highly promising candidates for further clinical development.
ARTICLE | doi:10.20944/preprints202311.1973.v1
Subject: Medicine And Pharmacology, Hematology Keywords: FLAMSA; allogeneic transplant (aHSCT); sequential conditioning; AML; MDS; myeloid malignancies; reduced-intensity conditioning; RIC; Venetoclax
Online: 30 November 2023 (10:24:31 CET)
Up to 50% of patients with high-risk myeloid malignancies die of relapse after allogeneic stem cell transplantation. Current sequential conditioning regimens such as the FLAMSA protocol combine intensive induction therapy with TBI or alkylators. Venetoclax has synergistic effects with chemotherapy. In a retrospective survey among German transplant centers, we identified 61 patients with myeloid malignancies who had received FLAMSA-based sequential conditioning with Venetoclax between 2018 and 2022 as an individualized treatment approach. Sixty patients (98%) had active disease at transplant and 74% had genetic high-risk features. Patients received allografts from matched unrelated, matched related or mismatched donors. Tumor lysis syndrome occurred in two patients, but no significant non-hematologic toxicity related to Venetoclax was observed. At day +30, 55 patients (90%) were in complete remission. Acute GvHD II°-IV° occurred in 17 patients (28%) and moderate/severe chronic GvHD in 7 patients (12%). Event-free survival and overall survival were 64% and 80% at 1 year, and 57% and 75% at 2 years, respectively. The combination of sequential FLAMSA-RIC with Venetoclax appears to be safe and highly effective. To further validate these findings and advance the concept of smart conditioning, a controlled prospective clinical trial was initiated in July 2023.
ARTICLE | doi:10.20944/preprints201903.0199.v1
Subject: Computer Science And Mathematics, Geometry And Topology Keywords: Generalized topological space, generalized compactness, generalized countable compactness, generalized sequential compactness, generalized local connectedness
Online: 20 March 2019 (15:38:12 CET)
Several specific types of generalized compactness of generalized topological spaces have been defined, investigated, and related to compactness in ordinary topological spaces at various points in the literature. Our recent research on a new class of generalized compactness in generalized topological spaces is reported herein as a starting point for further generalized classes.
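As background for readers outside this area, the standard definitions in Császár's framework (stated here from the general literature on generalized topological spaces, not taken from the abstract) can be summarized as:

```latex
% A generalized topology relaxes the axioms of an ordinary topology:
% \mu \subseteq \mathcal{P}(X) is a generalized topology on X if
%   (i)  \emptyset \in \mu, and
%   (ii) \mu is closed under arbitrary unions.
% (Closure under finite intersections is NOT required.)
%
% Generalized compactness is then the natural analogue:
A space $(X,\mu)$ is \emph{$\mu$-compact} if every cover of $X$ by
members of $\mu$ admits a finite subcover:
\[
X=\bigcup_{i\in I} U_i,\quad U_i\in\mu
\;\Longrightarrow\;
\exists\, i_1,\dots,i_n\in I:\; X=\bigcup_{k=1}^{n} U_{i_k}.
\]
```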
ARTICLE | doi:10.20944/preprints202012.0315.v1
Subject: Computer Science And Mathematics, Computer Science Keywords: Intrusion Detection Systems; Anomaly detection; Sequential analysis; Random Forest; Multi-Layer Perceptron; Long Short-Term Memory
Online: 14 December 2020 (09:36:58 CET)
With the latest advances in information and communication technologies, ever greater amounts of sensitive user and corporate information are shared across networks, making them susceptible to attacks that can compromise data confidentiality, integrity and availability. Intrusion Detection Systems (IDS) are important security mechanisms that can perform timely detection of malicious events through the inspection of network traffic or host-based logs. Over the years, many machine learning techniques have proven successful at anomaly detection, but only a few have considered the sequential nature of the data. This work proposes a sequential approach and evaluates the performance of a Random Forest (RF), a Multi-Layer Perceptron (MLP) and a Long Short-Term Memory (LSTM) network on the CIDDS-001 dataset. The resulting performance measures are compared with those obtained from a more traditional approach that considers only individual flow information, in order to determine which methodology best suits the scenario. The experimental outcomes suggest that anomaly detection is better addressed from a sequential perspective and that the LSTM is a very reliable model for acquiring sequential patterns in network traffic data, achieving an accuracy of 99.94% and an F1-score of 91.66%.
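The "sequential nature of the data" that this abstract contrasts with per-flow analysis can be illustrated with a minimal sliding-window transformation; the window length, feature layout, and names below are illustrative assumptions, not taken from the paper:

```python
from typing import List, Tuple

def make_sequences(flows: List[List[float]], labels: List[int],
                   window: int = 3) -> Tuple[List[List[List[float]]], List[int]]:
    """Group individual flow records into overlapping windows so a
    sequential model (e.g. an LSTM) can learn temporal patterns.
    Each window inherits the label of its last flow."""
    xs, ys = [], []
    for i in range(len(flows) - window + 1):
        xs.append(flows[i:i + window])
        ys.append(labels[i + window - 1])
    return xs, ys

# Toy example: 5 flows with 2 features each; label 1 marks an anomaly.
flows = [[0.1, 1.0], [0.2, 1.1], [0.9, 5.0], [0.2, 1.0], [0.1, 0.9]]
labels = [0, 0, 1, 0, 0]
X, y = make_sequences(flows, labels, window=3)
```

Each resulting window would be fed to a sequential model such as the LSTM evaluated in the paper, whereas the per-flow baseline scores each record in isolation.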
ARTICLE | doi:10.20944/preprints201809.0220.v1
Subject: Biology And Life Sciences, Forestry Keywords: sequential chemical extraction; 31P-nuclear magnetic resonance spectroscopy (31P-NMR); phosphorus; coastal sand dune; Casuarina forests
Online: 12 September 2018 (12:35:31 CEST)
Continuous research into the availability of phosphorus (P) in forest soil is critical for sustainable management of forest ecosystems. In this study, we used sequential chemical extraction and 31P-nuclear magnetic resonance spectroscopy (31P-NMR) to evaluate the forms and distribution of inorganic P (Pi) and organic P (Po) in Casuarina forest soils of a subtropical coastal sand dune at Houlong in Taiwan. The soil samples were collected from humic (+2-0 cm) and mineral layers (mineral-I: 0-10 cm, mineral-II: 10-20 cm) at two topographic locations (upland and lowland) differing in elevation. Sequential chemical extraction revealed that the NaOH-Po fraction, a moderately recalcitrant form of P, was dominant in the humic and mineral-I layers of both upland and lowland soils, whereas the cHCl-Pi fraction was dominant in the mineral-II layer. Resistant P content, including the NaOH-Pi, HCl-Pi, cHCl-Pi, and cHCl-Po fractions, was higher in the upland than in the lowland in the corresponding layers; however, labile P content (NaHCO3-Po) showed the opposite pattern. The content of resistant Pi (NaOH-Pi, HCl-Pi, and cHCl-Pi) increased significantly with depth, whereas that of labile Pi (resin-Pi and NaHCO3-Pi) and recalcitrant Po (NaHCO3-Po, NaOH-Po, and cHCl-Po) decreased significantly with depth at both locations. 31P-NMR spectroscopy revealed inorganic orthophosphate and monoester P as the major forms in this area. The proportions of Pi and Po evaluated by sequential chemical extraction and by 31P-NMR spectroscopy were largely consistent. The results indicate that the soils are highly weathered. Furthermore, P distribution and forms differed significantly between the upland and lowland owing to variation in elevation and eolian aggradation effects in this coastal sand dune landscape.
ARTICLE | doi:10.20944/preprints201810.0073.v1
Subject: Medicine And Pharmacology, Other Keywords: Classification; F-score; Gray-Level Co-occurrence Matrix (GLCM); Gray-Level Run-Length Matrix (GLRLM); Hepatocellular Carcinoma (HCC); Liver Cancer; Liver Abscess; Image Texture; Sequential Backward Selection (SBS); Sequential Forward Selection (SFS); Support Vector Machine (SVM); Ultrasound Image.
Online: 4 October 2018 (14:01:42 CEST)
This paper discusses computer-aided diagnosis (CAD) classification between Hepatocellular Carcinoma (HCC), i.e., the most common type of liver cancer, and liver abscess, based on ultrasound image texture features and a Support Vector Machine (SVM) classifier. From 79 cases of liver disease, comprising 44 cases of HCC and 35 cases of liver abscess, this research extracts 96 features of the Gray-Level Co-occurrence Matrix (GLCM) and Gray-Level Run-Length Matrix (GLRLM) from regions of interest (ROIs) in ultrasound images. Three feature selection models, i) Sequential Forward Selection, ii) Sequential Backward Selection, and iii) F-score, are adopted to select the features used to identify these liver diseases. Finally, the developed system classifies HCC and liver abscess by SVM with an accuracy of 88.875%. The proposed methods can provide diagnostic assistance in distinguishing the two kinds of liver disease using a CAD system.
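Sequential Forward Selection, one of the three feature selection models named above, can be sketched as a greedy wrapper. In this minimal sketch a toy additive score stands in for the classifier-based evaluation (e.g. SVM cross-validation accuracy) a real pipeline would use; all names and values are illustrative:

```python
from typing import Callable, List

def sequential_forward_selection(n_features: int,
                                 score: Callable[[frozenset], float],
                                 k: int) -> List[int]:
    """Greedy SFS: starting from the empty set, repeatedly add the single
    feature whose inclusion maximizes the score, until k are chosen."""
    selected: List[int] = []
    for _ in range(k):
        remaining = [f for f in range(n_features) if f not in selected]
        best = max(remaining, key=lambda f: score(frozenset(selected + [f])))
        selected.append(best)
    return selected

# Toy score: features 2 and 0 carry most of the signal.
weights = [0.3, 0.0, 0.5, 0.1]
toy_score = lambda s: sum(weights[f] for f in s)
chosen = sequential_forward_selection(4, toy_score, k=2)
```

Sequential Backward Selection is the mirror image: start from the full feature set and greedily remove the least useful feature at each step.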
ARTICLE | doi:10.20944/preprints202310.1653.v1
Subject: Biology And Life Sciences, Ecology, Evolution, Behavior And Systematics Keywords: adaptation; ancestor-descendant; Bayes factors; bryophytes; complexity; evolution; likelihood ratio; monothetic genus; sequential Bayes; Shannon information bits; Tainoa
Online: 25 October 2023 (12:56:53 CEST)
Detailed evaluation is provided for the likelihood statistics intrinsic to interlocking Sequential Bayes analysis, which allows estimation of evidential support for dendrograms charting the macroevolution of taxa. It involves complexity functions, such as fractal evolution, to generate well-supported evolutionary trees. Required are data on trait changes from ancestral species to descendant species, which is facilitated by reduction of large genera to monothetic groups (one ancestral species each), most conveniently named as separate genera. Assigning each species one Shannon informational bit per new trait, adding the traits through sequential Bayesian analysis (each posterior probability used as the prior for the next iteration of Bayes' formula), and then interpreting the result with an odds chart provides a posterior probability that the ancestor-descendant order is correct, that is, that it entirely follows evolutionary theory involving adaptation and the rarity of total trait reversals. The key fact is that the most recently acquired traits of the ancestral species are selectively inviolate and passed on without change to each descendant species. The details of sequential Bayesian analysis were clarified by calculation of likelihood ratios and Bayes factors that compare optimal models with the next most likely model. Such analysis demonstrated that the optimum arrangement of ancestor and descendant species leads to high support values for fitting evolutionary theory, comparable to statistical support levels reported for molecular evolutionary trees. The fact that phylogenetic analysis does not offer an ancestor-descendant model is found to be conducive to precision at the expense of accuracy, and leads to wrong ancestor-free models with likelihoods explaining the data at high credible levels. The number of advanced traits in the outgroup, used as a Bayesian prior, greatly enhances the posterior probability.
The method is simple, free of special computer analysis, and well-suited to standard taxonomic practice.
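The chained Bayes step described above, in which each posterior is reused as the next prior and each new trait contributes one Shannon bit of evidence, can be sketched as follows. The likelihood ratio of 2 per bit is the only modeling assumption here:

```python
def sequential_bayes(prior: float, n_traits: int, bit_ratio: float = 2.0) -> float:
    """Chain Bayes' formula: the posterior after each trait becomes the
    prior for the next iteration. One Shannon bit per trait corresponds
    to a likelihood ratio of 2 in favor of the ancestor-descendant order."""
    p = prior
    for _ in range(n_traits):
        p = p * bit_ratio / (p * bit_ratio + (1.0 - p))
    return p

# Three new traits starting from an uninformative 0.5 prior:
posterior = sequential_bayes(0.5, 3)
```

Equivalently, each bit doubles the odds: starting from even odds, three traits give odds of 8:1, i.e. a posterior of 8/9, which is the kind of value read off an odds chart.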
ARTICLE | doi:10.20944/preprints201709.0009.v1
Subject: Environmental And Earth Sciences, Remote Sensing Keywords: machine learning; sequential maximum a posteriori; random forest; support vector machines; land use classification; textural information; contextual information
Online: 4 September 2017 (10:43:26 CEST)
The aim of this study is to evaluate three different strategies to improve classification accuracy in a highly fragmented semiarid area: i) using different classification algorithms: Maximum Likelihood, Random Forest, Support Vector Machines and Sequential Maximum a Posteriori, with parameter optimisation in the second and third cases; ii) using different feature sets: spectral features; spectral and textural features; and spectral, textural and terrain features; and iii) using different image sets: winter, spring, summer, autumn, winter+summer, winter+spring+summer, and a four-season combination. A 3-way ANOVA is used to discern which of these approaches and their interactions significantly increase accuracy. A Tukey-Kramer contrast using a heteroscedasticity-consistent estimate of the kappa covariance matrix was used to check for significant differences in accuracy. The experiment was carried out with Landsat TM, ETM, and OLI images from the period 2000-2015. A combination of four images was the best way to improve accuracy. Maximum Likelihood, Random Forest and Support Vector Machines do not significantly increase accuracy when textural information is added, but do so when terrain features are taken into account. On the other hand, Sequential Maximum a Posteriori increases accuracy when textural features are used, but reduces accuracy substantially when terrain features are included. Random Forest using the three feature subsets and Sequential Maximum a Posteriori with spectral and textural features had the largest kappa values, around 0.9.
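The kappa statistic on which these accuracy comparisons rest can be computed directly from a confusion matrix. A minimal sketch with illustrative counts (not data from the study):

```python
def cohens_kappa(confusion: list) -> float:
    """Cohen's kappa from a square confusion matrix: observed agreement
    corrected for the agreement expected by chance from the marginals."""
    n = sum(sum(row) for row in confusion)
    po = sum(confusion[i][i] for i in range(len(confusion))) / n
    pe = sum(sum(confusion[i]) * sum(r[i] for r in confusion)
             for i in range(len(confusion))) / (n * n)
    return (po - pe) / (1.0 - pe)

# Two-class example: 90% observed agreement, 50% expected by chance.
kappa = cohens_kappa([[45, 5], [5, 45]])
```

Because kappa discounts chance agreement, values around 0.9, as reported above, indicate near-perfect agreement with the reference data.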
ARTICLE | doi:10.20944/preprints202005.0444.v1
Subject: Computer Science And Mathematics, Computational Mathematics Keywords: restricted Boltzmann machine; contrastive divergence; extreme learning machine; online sequential extreme learning machine; autoencoders; deep belief network; deep learning
Online: 27 May 2020 (08:18:39 CEST)
The main contribution of this paper is to introduce a new iterative training algorithm for restricted Boltzmann machines. The proposed learning path is inspired by the online sequential extreme learning machine, an extreme learning machine variant that deals with time-accumulated sequences of data of fixed or varying size. Recursive least squares rules are integrated for weight adaptation to avoid learning-rate tuning and local minimum issues. The proposed approach is compared with one of the best-known training algorithms for Boltzmann machines, "contrastive divergence", in terms of time, accuracy and algorithmic complexity under the same conditions. Results strongly favor the proposed rules for data reconstruction.
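The recursive least squares rules mentioned above can be illustrated in their simplest, scalar form. This is a generic RLS sketch, not the paper's actual RBM update; the variable names and toy data are assumptions:

```python
def rls_fit(xs, ys, lam=1.0):
    """Scalar recursive least squares: update a single weight sample by
    sample with no hand-tuned learning rate. `lam` is the forgetting
    factor (1.0 = no forgetting); P tracks the inverse input covariance."""
    w, P = 0.0, 1e8  # weight estimate and its (scalar) covariance
    for x, y in zip(xs, ys):
        k = P * x / (lam + x * P * x)   # gain
        w += k * (y - w * x)            # innovation-driven update
        P = (P - k * x * P) / lam       # covariance shrinks as data arrive
    return w

xs = [0.5, 1.0, 1.5, 2.0, 2.5]
ys = [3 * x for x in xs]  # noiseless target: y = 3x
w = rls_fit(xs, ys)
```

The step size is set automatically by the gain `k`, which is what lets an RLS-based rule sidestep the learning-rate tuning that gradient methods like contrastive divergence require.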
ARTICLE | doi:10.20944/preprints201608.0218.v1
Subject: Engineering, Control And Systems Engineering Keywords: micro-surface imaging; evaluation; nondestructive; enhancement techniques; thin plate spline; linear sequential estimation; windowing technique; Lenna image; MEMS imaging
Online: 27 August 2016 (10:43:36 CEST)
This article develops algorithms for the characterization and visualization of micro-scale features using a small number of sample points, with the goal of mitigating the shortcomings of measurement methods that are often destructive or time-consuming. We implement the algorithms to rapidly examine the microscopic features of a Microelectromechanical System (MEMS) surface. Such images are highly dense; therefore, traditional image processing techniques can be computationally expensive. The contributions of this research are threefold. First, we develop local and global algorithms based on a modified Thin Plate Spline (TPS) model to reconstruct high-resolution images of the micro-surface's topography and its derivatives from low-resolution images. Second, we derive a bending-energy algorithm from our modified TPS model and use it to filter out image defects. Finally, we develop a computationally efficient windowing technique, which combines TPS and Linear Sequential Estimation (LSE), to enhance the visualization of images. The windowing technique allows rapid image reconstruction by reducing the size of the inverse problem.
ARTICLE | doi:10.20944/preprints202108.0018.v1
Subject: Physical Sciences, Radiation And Radiography Keywords: deep reinforcement learning; source search and localization; active search; gamma radiation; source parameter estimation; sequential decision making; non-convex environment
Online: 2 August 2021 (11:14:24 CEST)
Rapid search and localization of nuclear sources can be an important aspect of preventing human harm from illicit material in dirty bombs or from contamination. In the case of a single mobile radiation detector, there are numerous challenges to overcome, such as weak source intensity, multiple sources, background radiation, and the presence of obstructions, i.e., a non-convex environment. In this work, we investigate the sequential decision-making capability of deep reinforcement learning in the nuclear source search context. A novel neural network architecture (RAD-A2C) based on the actor critic (A2C) framework and a particle filter gated recurrent unit for localization is proposed. Performance is studied in randomized 20 x 20 m convex and non-convex environments across a range of signal-to-noise ratios (SNRs) for a single detector and a single source. RAD-A2C performance is compared with both an information-driven controller that uses a bootstrap particle filter and a gradient search (GS) algorithm. We find that RAD-A2C has performance comparable to the information-driven controller across SNRs in a convex environment, at lower computational complexity per action. RAD-A2C far outperforms the GS algorithm in the non-convex environment, with a greater than 95% median completion rate for up to seven obstructions.
ARTICLE | doi:10.20944/preprints202312.0190.v1
Subject: Engineering, Architecture, Building And Construction Keywords: two-dimensional (2D) orthomosaic; drone video processing; real-time image registration; sequential registration; scale-invariant feature transform (SIFT); fast library for approximate nearest neighbors (FLANN); random sample consensus (RANSAC)
Online: 4 December 2023 (11:14:13 CET)
This study presents a method for the rapid and accurate generation of two-dimensional (2D) orthomosaic maps using selected image data from drone-captured video. The focus is on developing a real-time method capable of creating maps more quickly than image selection. The scale-invariant feature transform (SIFT) algorithm is applied to drone images to extract features across scale regions. For feature point matching, a matching method based on the fast library for approximate nearest neighbors (FLANN) was adopted. A comparison of the computational speed of FLANN with that of the k-nearest neighbors (KNN) and brute-force matchers during the matching process revealed FLANN's superior capability for real-time data processing. The random sample consensus (RANSAC) algorithm was employed to enhance matching accuracy by removing outliers, effectively identifying and eliminating mismatches and reinforcing the reliability of feature point matching. The combination of the SIFT, FLANN, and RANSAC algorithms demonstrates the capacity to process drone-captured image data in real time, facilitating the generation of precise 2D orthomosaic maps. The proposed method was assessed and validated using imagery obtained as the drone executed curvilinear and straight flight paths, confirming its accuracy and operational efficiency concurrently with the image capture process.
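The RANSAC stage can be illustrated on the simplest motion model, a pure 2D translation. Real orthomosaic registration estimates a full homography (e.g. via OpenCV), but the hypothesize-and-vote logic is the same; all coordinates below are illustrative:

```python
import random

def ransac_translation(matches, iters=200, tol=1.0, seed=0):
    """Estimate a 2D translation from putative point matches with RANSAC:
    repeatedly hypothesize a shift from one random match, count inliers
    within `tol`, and keep the hypothesis with the most support."""
    rng = random.Random(seed)
    best_t, best_inliers = (0.0, 0.0), -1
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.choice(matches)  # minimal sample: 1 match
        dx, dy = x2 - x1, y2 - y1
        inliers = sum(1 for (a, b), (c, d) in matches
                      if abs(c - a - dx) <= tol and abs(d - b - dy) <= tol)
        if inliers > best_inliers:
            best_inliers, best_t = inliers, (dx, dy)
    return best_t, best_inliers

# Four correct matches shifted by (10, 5), plus one gross mismatch.
matches = [((0, 0), (10, 5)), ((1, 2), (11, 7)),
           ((3, 1), (13, 6)), ((5, 5), (15, 10)),
           ((2, 2), (40, 40))]  # outlier
t, n_inliers = ransac_translation(matches)
```

Because a hypothesis drawn from the mismatch earns only its own vote, the consensus shift wins even though the outlier's error is huge, which is exactly why RANSAC makes feature matching robust.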