ARTICLE | doi:10.20944/preprints201901.0098.v1
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: Additive Manufacturing; Electron Beam Melting; In-Process Monitoring; Quality Control; Electronic Imaging
Online: 10 January 2019 (11:52:10 CET)
Electron Beam Melting (EBM) is an increasingly used Additive Manufacturing (AM) technique employed by many industrial sectors, including the medical device and aerospace industries. The application of this technology is, however, challenged by the lack of a process monitoring and control system to underpin process repeatability and part quality reproducibility. An electronic imaging system prototype has been developed to serve as an EBM monitoring technique, and its capabilities have been verified at room temperature and at 320 ± 10 °C. Nevertheless, to fully assess the applicability of this technique, image quality needs to be investigated across a range of elevated temperatures to fully understand the influence of thermal noise due to heat. In this paper, electronic imaging pilot trials were carried out over a range of temperatures, from room temperature upward. The image quality measure Q of the digital electron images was evaluated, and the influence of temperature was investigated. In this study, raw electronic images generated at higher temperatures had greater Q values, i.e. better global image quality. It was demonstrated that, across the tested temperature range, the influence of temperature on electronic image quality did not adversely affect the visual clarity of image features. It is envisaged that the prototype has significant potential to contribute to in-process EBM monitoring in many manufacturing sectors.
ARTICLE | doi:10.20944/preprints201901.0169.v1
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: electron beam melting; in-process monitoring; quality control; electronic imaging; spatial resolution
Online: 17 January 2019 (02:53:17 CET)
Electron Beam Melting (EBM) is an increasingly used Additive Manufacturing (AM) technique employed by many industrial sectors, including the medical device and aerospace industries. In-process EBM monitoring for quality assurance purposes has been a popular research area. Electronic imaging has recently been investigated as one of the in-process EBM data collection methods, alongside thermal/optical imaging techniques. Although certain capabilities of an electronic imaging system have been investigated, experiments are yet to be carried out to benchmark one of the most important features of any imaging system: spatial resolution. This article addresses this knowledge gap by: (1) proposing an indicator for the estimation of spatial resolution which includes the Backscattered Electron (BSE) information depth, (2) estimating the achievable spatial resolution when electronic imaging is carried out inside an Arcam A1 EBM machine, and (3) presenting an experimental method to conduct edge resolution evaluation with the EBM machine. Analyses of the experimental results indicated that the spatial resolution was of the order of 0.3 mm to 0.4 mm when electronic imaging was carried out at room temperature. It is believed that, by disseminating an analysis and experimental method to estimate and quantify spatial resolution, this study has contributed to the ongoing quality assessment research in the field of in-process monitoring of the EBM process.
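The edge resolution evaluation mentioned above can be illustrated with a generic sketch (not the paper's actual procedure): an intensity profile is scanned across a sharp edge, the edge spread function (ESF) is differentiated into a line spread function (LSF), and the LSF's full width at half maximum (FWHM) serves as a resolution estimate. All numbers here are synthetic.

```python
import numpy as np
from math import erf

def edge_resolution(profile, pixel_mm):
    """Estimate resolution from an edge scan: differentiate the edge
    spread function into a line spread function and return its FWHM."""
    lsf = np.abs(np.diff(profile))        # line spread function
    lsf = lsf / lsf.max()
    above = np.where(lsf >= 0.5)[0]       # samples at or above half maximum
    return (above[-1] - above[0] + 1) * pixel_mm

# Synthetic edge profile: a step blurred by a Gaussian (sigma = 2 samples)
x = np.arange(200)
sigma = 2.0
esf = np.array([0.5 * (1 + erf((xi - 100) / (sigma * np.sqrt(2)))) for xi in x])

# With a 0.1 mm pixel pitch, this count-based FWHM estimate returns 0.4 mm
resolution_mm = edge_resolution(esf, 0.1)
```

The count-based FWHM slightly underestimates the continuous Gaussian FWHM (2.355 sigma) because of discretization; a real evaluation would fit the LSF before measuring its width.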
ARTICLE | doi:10.20944/preprints202105.0121.v1
Subject: Engineering, Automotive Engineering Keywords: speckle photography; in-process measurement; deep rolling process
Online: 7 May 2021 (08:55:13 CEST)
In the concept of the process signature, the relationship between a material load and the modification remaining in the workpiece is used to better understand and optimize manufacturing processes. The basic prerequisite for this is the ability to measure the loads occurring during the machining process in the form of mechanical deformations. Speckle photography is suitable for this in-process measurement task and is already widely used for in-plane deformation measurements. The shortcoming of this fast and robust measurement technique, which is based on image correlation, is that out-of-plane deformations in the direction of the measurement system cannot be detected and that they increase the measurement error of in-plane deformations. In this paper, we investigate a method that infers local out-of-plane motions of the workpiece surface from the decorrelation of speckle patterns and is thus able to reconstruct three-dimensional deformation fields. The implementation of the evaluation method enables a fast reconstruction of 3D deformation fields, so that in-process capability is retained. First measurements in a deep rolling process show that dynamic deformations underneath the die can be captured and demonstrate the suitability of the speckle method for manufacturing process analysis.
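A minimal sketch of the decorrelation idea (illustrative only; the paper's reconstruction method is more involved): the zero-normalized cross-correlation coefficient between a reference and a deformed speckle pattern drops as out-of-plane motion decorrelates the pattern, so the correlation loss can serve as a proxy for that motion.

```python
import numpy as np

def zncc(a, b):
    """Zero-normalized cross-correlation coefficient of two image patches."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a**2).sum() * (b**2).sum()))

rng = np.random.default_rng(0)
ref = rng.random((64, 64))              # reference speckle pattern

# Simulated out-of-plane effect: the pattern partially decorrelates
noise = rng.random((64, 64))
deformed = 0.8 * ref + 0.2 * noise

rho = zncc(ref, deformed)               # close to, but below, 1.0
decorrelation = 1.0 - rho               # proxy for out-of-plane motion
```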
ARTICLE | doi:10.20944/preprints202205.0072.v1
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: Electrical Resistance Tomography; Visualization; Crystallization process monitoring; process operation
Online: 6 May 2022 (10:20:02 CEST)
In the current research work, electrical resistance tomography (ERT) was employed for crystallization process monitoring and visualization. A first-of-its-kind MATLAB-based interactive GUI application, “ERT-Vis”, is presented. Two case studies involving different crystallization methods were undertaken. Experiments involving reactive (precipitative) crystallization of calcium carbonate, representing a high-conductivity solution-solute medium, and cooling crystallization of sucrose, representing a lower-conductivity solution-solute combination, were designed and performed. The software successfully provided key insights into process progress in both crystallization systems. It could detect and separate crystal agglomerations in both low- and high-conductivity solutions using the visual analytics tools provided. The performance and utility of the software were studied in a software evaluation case study involving domain experts. Participant feedback indicated that ERT-Vis helps reconstruct images instantaneously, visualize data interactively, and evaluate the output of crystallization process monitoring.
CASE REPORT | doi:10.20944/preprints201810.0465.v1
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: production system; simulation manufacturing process; simulation model; work in process
Online: 22 October 2018 (04:56:28 CEST)
During the last decades, production systems have developed different strategies to increase their competitiveness in the global market. In manufacturing and service systems, Lean Manufacturing has been consolidated through the correct implementation of its tools. This paper presents a case study developed in a food-packing company, where a simulation model was considered as an alternative for reducing the wasted time generated by the poor distribution of operations and transportation areas for a product within the factory. The company had detected problems in the layout distribution that prevented it from fulfilling market demand. The principal aim was to create a simulation model to test different hypothetical scenarios and alternative layout designs without modifying the facilities. The implemented methodology was based on classical models of simulation projects and a compendium of manufacturing system optimization by simulation used during the last ten years. A mathematical model supported by the Promodel® simulation software was developed considering the company's characteristics; with the model, it was possible to compare production system performance in terms of the percentage of used locations, the percentage of resource utilization, the number of finished products, and the level of Work in Process (WIP). Verification and validation stages were performed before running the scenarios in the real production area. The results generated by the project represent an increase of 68% in production capacity and a reduction of 5% in WIP. Both outcomes are associated with the management of resources, which were reassigned to other production areas.
ARTICLE | doi:10.20944/preprints201803.0011.v1
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: Oxygen monitoring; LED; Photoluminescence; Wine production; Fermentation process; Process control.
Online: 1 March 2018 (16:16:37 CET)
The importance of oxygen in the winemaking process is widely known, as it affects the chemical and therefore the organoleptic characteristics of the final product. Hence, the usefulness of continuous, real-time measurement of oxygen levels in the various stages of the winemaking process, both for monitoring and for control, is evident. The WOW project has focused on the design and development of an innovative device for monitoring oxygen levels in wine. The system uses an optical fiber to measure the variation in luminescence lifetime of a reference metal/porphyrin complex, which decays in the presence of oxygen. The developed technology results in a high-sensitivity, low-cost sensor head that can be employed to measure dissolved oxygen levels at several points inside a wine fermentation or aging tank. The system can be complemented with dynamic modeling techniques to predict nutrient evolution in space and time from a few sampled measuring points, for both process monitoring and control purposes. The experimental validation of the technology was first performed in a controlled laboratory setup, to calibrate the sensor and study its sensitivity with respect to different photo-luminescent compounds and alcoholic or non-alcoholic solutions, and then in an actual case study during a measurement campaign at a renowned Italian winery.
Online: 19 April 2019 (11:18:13 CEST)
This paper describes a multi-channel in-situ monitoring system developed to better understand defect formation signatures in metal additive manufacturing. Three high-speed imaging modes, coupled with an image computer capable of processing and storing these data streams, allowed an examination of defect formation signatures and mechanisms. It was found that defects later detected in X-ray computed tomography (CT) scans were related to regions with anomalous heat signatures and powder bed morphology. Automated defect detection algorithms based on these defect signatures captured 80% of defects greater than 300 µm.
ARTICLE | doi:10.20944/preprints201801.0140.v1
Subject: Social Sciences, Organizational Economics & Management Keywords: entrepreneurial sustainability strategy; system thinking; business process management; process improvements; innovation in higher education; sustainable organizational performance
Online: 16 January 2018 (10:44:40 CET)
The sustainable development of our world has gained particular attention from a wide range of decision makers, civil society, the business sector, and the scientific community, given that the prosperity of people and society depends on sustained and inclusive economic growth across all countries and regions. The educational environment has a decisive impact on how societies cope with the national, regional, and global challenges and opportunities brought by sustainable development. Looking at the implications of higher education (HE) for the progress of society, the paper addresses the lack of HE institutional capacity to integrate the principles and practices of sustainable development into all aspects of education and learning. The scope of the research problem was bounded by the capability of a higher education institution (HEI), as an organization and a school, to act as an entrepreneurial university by aligning its responsibilities within the value chain through a practical and effective mechanism that connects strategy with the Sustainable Development Goals (SDGs). Embarking on the path of the SDGs requires HEIs to design, launch, implement, and customize specific process architectures to govern the advance of the sustainability approach. The authors applied a process scoping diagram to capture and conceptualize the educational model needed to guide an HEI through the process of change to embed sustainability into organizational culture and daily operations. The SIPOC method (Supplier, Input, Process, Output, Customer), supported by the Visio software tool, was used to articulate the process relationships embedded in the educational model of the HEI. The benefits rely on an organized view of the work processes that need to be performed to incorporate the SDGs into the strategy of any entrepreneurial HEI. Finally, the authors share their views on the scalability of the model, which may be customized and harmonized in accordance with different HE circumstances and priorities.
Implementing the proposed educational model requires long-term institutional commitment, transparency, continuous performance improvement, and communicating the strategy for SDGs and its achievements to wider stakeholders.
ARTICLE | doi:10.20944/preprints201908.0246.v1
Subject: Mathematics & Computer Science, Probability And Statistics Keywords: statistics; in silico experiments; neural simulation; gaussian process regression; Bayesian inference
Online: 23 August 2019 (11:27:23 CEST)
Many scientific and technological problems are studied with the help of computer codes that simulate the phenomena of interest rather than via traditional laboratory experiments. Such models play an important role in neuroscience where they are used to mimic brain function from the sub-cellular to the macroscopic level. Exploration with computer models carries with it a number of statistical challenges: where to sample the input space for the simulator, how to make sense of the data that is generated, how to estimate unknown parameters in the model, how to validate a model. The simulator setting also has some unique problems and possibilities. This review paper describes statistical research on these issues and how that work might be applied to neural simulations.
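A minimal Gaussian-process emulator of a black-box simulator, of the kind discussed above, can be sketched as follows. The sine stand-in for the simulator and all parameters are illustrative.

```python
import numpy as np

def rbf(x1, x2, ls=1.0, var=1.0):
    """Squared-exponential (RBF) covariance between two sets of inputs."""
    d = x1[:, None] - x2[None, :]
    return var * np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise=1e-6):
    """Gaussian-process emulator: posterior mean at test inputs."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    k_star = rbf(x_test, x_train)
    return k_star @ np.linalg.solve(K, y_train)

# Treat an expensive simulator as a black box; here a cheap stand-in
simulator = lambda x: np.sin(x)
x_train = np.linspace(0, 2 * np.pi, 12)   # design points in input space
y_train = simulator(x_train)

x_test = np.array([1.0, 2.5])
pred = gp_posterior_mean(x_train, y_train, x_test)
# pred closely reproduces the simulator between design points
```

In the simulator setting, the emulator's cheap predictions (and, with the full posterior covariance, their uncertainties) can then drive input-space sampling and parameter estimation.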
ARTICLE | doi:10.20944/preprints201809.0078.v1
Subject: Engineering, Energy & Fuel Technology Keywords: Ethanol; corn; dry-grind process; bolt-on process; corn fiber; soaking in aqueous ammonia pretreatment; cellulase; cellulosic ethanol.
Online: 5 September 2018 (01:40:11 CEST)
Corn fiber is a co-product of commercial dry-grind ethanol plants, where it is processed into distillers dried grains with solubles (DDGS) and used as animal feed, yet it holds high potential as a feedstock for additional ethanol production. Due to the tight structural make-up of corn fiber, a pretreatment step is necessary to make the cellulose and hemicellulose polymers in the solid fibrous matrix more accessible to hydrolytic enzymes. A pretreatment process was developed in which whole corn kernels were soaked in aqueous solutions of 2.5, 5.0, 7.5 and 10.0 wt% ammonia at 105 °C for 24 h. The pretreated corn was then subjected to a conventional mashing procedure and subsequent ethanol fermentation using a commercial strain of natural Saccharomyces cerevisiae with the addition of a commercial cellulase. Pretreatment of the corn with 7.5 wt% ammonia solution plus cellulase addition gave the highest ethanol production, improving the yield in fermentation at 25 wt% solids from 334 g ethanol/kg corn in the control (no pretreatment and no cellulase addition) to 379 g ethanol/kg corn (a 14% increase). The process can potentially be implemented in existing dry-grind ethanol facilities as a “bolt-on” process for additional ethanol production from corn fiber; this additional ethanol can then qualify as “cellulosic ethanol” under the EPA's Renewable Fuel Standard and thereby receive RINs (Renewable Identification Numbers).
ARTICLE | doi:10.20944/preprints202104.0013.v1
Subject: Engineering, Civil Engineering Keywords: transportation infrastructure; concrete bridges; structural health monitoring; bridge condition index; analytical hierarchy process; prioritizing
Online: 1 April 2021 (11:14:27 CEST)
This paper proposes a method for monitoring the structural health of concrete bridges in Iran. In this method, the bridge condition index (BCI) of a bridge is determined by the analytical hierarchy process (AHP). The BCI comprises eight indices that are scored based on experts' views: structural; hydrology and climate; safety; load impact; geotechnical and seismicity; strategic importance; facilities; and traffic and pavement. The experts' views were analyzed with the Expert Choice software, and the relative importance (weight) of each index was determined using AHP. The scores given by the experts were then assigned to the indices for various conditions. Bridge inspectors can examine a bridge, determine the scores of the indices, and compute the BCI. Higher BCI values indicate better conditions; therefore, bridges with lower BCI values take priority in maintenance activities. Five bridges in Semnan province, Iran, were selected as case studies, and their BCIs were calculated.
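The BCI computation described above reduces, in essence, to a weighted sum of index scores with AHP-derived weights. The following sketch uses hypothetical weights and scores, since the abstract does not report the actual values.

```python
# Hypothetical weights and scores; the paper's actual values come from
# Expert Choice / AHP and are not given in the abstract.
indices = ["structural", "hydrology_climate", "safety", "load_impact",
           "geotechnical_seismicity", "strategic_importance",
           "facilities", "traffic_pavement"]

weights = {"structural": 0.30, "hydrology_climate": 0.10, "safety": 0.15,
           "load_impact": 0.10, "geotechnical_seismicity": 0.10,
           "strategic_importance": 0.10, "facilities": 0.05,
           "traffic_pavement": 0.10}       # AHP-derived, must sum to 1
assert abs(sum(weights.values()) - 1.0) < 1e-9

def bci(scores):
    """Bridge Condition Index as a weighted sum of inspector scores (0-100)."""
    return sum(weights[k] * scores[k] for k in indices)

bridge_a = {k: 80 for k in indices}
bridge_b = {k: 60 for k in indices}
# Lower BCI -> higher maintenance priority
priority = sorted(["A", "B"], key=lambda b: bci(bridge_a if b == "A" else bridge_b))
```

Here `priority` lists bridge B first, since its lower BCI puts it ahead in the maintenance queue.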
ARTICLE | doi:10.20944/preprints201809.0573.v1
Subject: Earth Sciences, Other Keywords: crowdsourcing; citizen science; agriculture; street-view; in-situ; LUCAS; Copernicus
Online: 28 September 2018 (16:30:41 CEST)
New approaches to collecting in-situ data are needed to complement the high spatial (10 m) and temporal (5-day) resolution of Copernicus Sentinel satellite observations. Making sense of Sentinel observations requires high-quality and timely in-situ data for training and validation. Classical ground truth collection is expensive, lacks scale, fails to exploit opportunities for automation, and is prone to sampling error. Here we evaluate the potential of opportunistically exploiting crowd-sourced street-level imagery to collect massive amounts of high-quality in-situ data in the context of crop monitoring. This study assesses this potential by answering two questions: (1) what is the spatial availability of these images across the European Union (EU)? and (2) can these images be transformed into useful data? To answer the first question, we evaluated the EU availability of street-level images on Mapillary, the largest open-access platform for such images, against the Land Use and Land Cover Area frame Survey (LUCAS) 2018, a systematic survey sampling of 337031 points. For 37.78% of the LUCAS points, a crowd-sourced image is available within a 2-km buffer, with a mean distance of 816.11 m. We estimate that 9.44% of the EU territory has a crowd-sourced image within 300 m of a LUCAS point, illustrating the huge potential of crowd-sourcing as a complementary sampling tool. After the artificial and built-up (63.14%) and inland water (43.67%) land cover classes, arable land has the highest availability at 40.78%. To answer the second question, we focused on identifying crops at parcel level using all 13.6 million Mapillary images collected in the Netherlands. Only 1.9% of the contributors generated 75.15% of the images. A procedure was developed to select and harvest the pictures potentially best suited to identifying crops, using the geometries of 785710 Dutch parcels and the pictures' metadata, such as camera orientation and focal length.
The availability of crowd-sourced imagery looking at parcels was assessed for 8 different crop groups using the 2017 parcel-level declarations. Parcel revisits during the growing season made it possible to track crop growth. Examples illustrate the capacity to recognize crops and their phenological development in crowd-sourced street-level imagery. Consecutive images taken during the same capture track allow selection of the image with the best unobstructed view. In the future, dedicated crop capture tasks can improve image quality and expand coverage in rural areas.
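The spatial-availability analysis above boils down to checking, for each survey point, whether any street-level image lies within a buffer distance. A minimal sketch with toy coordinates (not LUCAS or Mapillary data):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def buffer_availability(points, images, radius_m):
    """Fraction of survey points with at least one image within radius_m."""
    hits = sum(
        any(haversine_m(plat, plon, ilat, ilon) <= radius_m
            for ilat, ilon in images)
        for plat, plon in points
    )
    return hits / len(points)

# Toy data: three LUCAS-style points, two image locations;
# two of the three points have an image within the 2-km buffer
points = [(52.0, 5.0), (52.5, 5.5), (53.0, 6.0)]
images = [(52.001, 5.001), (53.01, 6.0)]
availability = buffer_availability(points, images, 2000)
```

At continental scale, a spatial index (e.g. a k-d tree on projected coordinates) would replace the brute-force pairwise check.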
Subject: Engineering, Civil Engineering Keywords: Energy consumption monitoring system; Building energy conservation management; Insect Intelligent Building technology; Computing process node; Insect intelligent algorithm
Online: 4 September 2019 (14:27:48 CEST)
In this paper, a methodology using Insect Intelligent Building (I^2B) technology to establish an energy consumption monitoring system for public buildings is presented. Computing Process Nodes (CPNs) and distributed algorithms are utilized to implement energy consumption collection, data transmission, and data pre-processing. Taking a commercial building as a case study, CPNs are applied to set up the building energy consumption monitoring system, with the Spanning Tree Algorithm used to generate the network topology and the BPNN method used to handle abnormal data and recover missing data. The results demonstrate that the proposed method can effectively improve the plug-and-play, self-identification, and self-configuration performance of the energy consumption monitoring system.
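The network-topology step can be illustrated with a minimal spanning-tree sketch over hypothetical CPN links (the paper's exact algorithm and link weights are not given in the abstract; this uses a greedy Prim-style construction):

```python
# Hypothetical CPN link weights (e.g. wiring cost or hop latency)
edges = {("A", "B"): 1, ("A", "C"): 4, ("B", "C"): 2, ("B", "D"): 5, ("C", "D"): 1}

def spanning_tree(nodes, edges):
    """Greedy minimum spanning tree (Prim's algorithm)."""
    visited = {next(iter(nodes))}
    tree = []
    while visited != set(nodes):
        # cheapest edge with exactly one endpoint inside the visited set
        (u, v), w = min(
            ((e, w) for e, w in edges.items()
             if (e[0] in visited) != (e[1] in visited)),
            key=lambda item: item[1])
        tree.append((u, v, w))
        visited |= {u, v}
    return tree

tree = spanning_tree(["A", "B", "C", "D"], edges)
# n - 1 = 3 links connect the four CPNs: A-B (1), B-C (2), C-D (1)
```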
ARTICLE | doi:10.20944/preprints201701.0035.v1
Subject: Biology, Other Keywords: QuantiFERON-TB Gold In-Tube test; pulmonary tuberculosis; sensitivity; specificity
Online: 9 January 2017 (03:58:00 CET)
The value of QuantiFERON in the diagnosis of tuberculosis and in the monitoring of the response to anti-tuberculosis treatment is unclear. The aims of this study were to evaluate the accuracy of the QuantiFERON-TB Gold In-Tube (QFT-GIT) test in the diagnosis of tuberculosis and in the monitoring of the response to anti-tuberculosis treatment in patients with active pulmonary tuberculosis (PTB). Between January 2013 and December 2015, 128 cases with active PTB and 128 controls with no mycobacterial infection, matched by age (within 3 years) and by the week that they visited Tainan Chest Hospital, were enrolled in the study. Serial testing by QFT-GIT at baseline and after 2 and 6 months of treatment was performed. At these time points, a comparison of the performance of QFT-GIT with that of sputum culture status among study subjects was conducted. Compared to baseline, 116 (87.2%) cases showed a decreased response, whereas 17 (12.8%) showed persistent or stronger interferon-gamma (IFN-γ) responses at 2 months. Their IFN-γ responses declined significantly from baseline to 2 months (median, 6.32 vs. 4.12; P < 0.005). The sensitivity values of the QFT-GIT test for the detection of pulmonary tuberculosis at cut-off points of 0.35 IU/ml, 0.20 IU/ml, and 0.10 IU/ml were 74.4%, 78.2%, and 80.5%, respectively. The specificity values at cut-off points of 0.35 IU/ml, 0.20 IU/ml, and 0.10 IU/ml were 66.2%, 63.9%, and 57.1%, respectively. Our results support the QFT-GIT assay as a potential tool for diagnosing tuberculosis and for monitoring the efficacy of anti-tuberculosis treatment.
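The cutoff-dependent sensitivity and specificity reported above follow the standard definitions; a sketch with illustrative assay values (not the study data) shows the typical trade-off as the cutoff is lowered:

```python
def sens_spec(case_values, control_values, cutoff):
    """Sensitivity and specificity of an IFN-gamma assay at a given cutoff.

    A result >= cutoff (IU/ml) is read as positive.
    """
    tp = sum(v >= cutoff for v in case_values)
    fn = len(case_values) - tp
    tn = sum(v < cutoff for v in control_values)
    fp = len(control_values) - tn
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative IFN-gamma values only (not the study data)
cases = [0.05, 0.4, 0.9, 1.2, 3.5, 0.15, 6.3, 0.8]
controls = [0.01, 0.1, 0.5, 0.02, 0.3, 0.04, 1.1, 0.08]

for cutoff in (0.35, 0.20, 0.10):
    sens, spec = sens_spec(cases, controls, cutoff)
    # as the cutoff is lowered, sensitivity rises and specificity falls,
    # matching the trend reported in the study
```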
ARTICLE | doi:10.20944/preprints202011.0492.v1
Subject: Materials Science, Biomaterials Keywords: 2D fatty liver in vitro model; Blu-Ray disc; Plasmonic Nanomaterials; Label-Free Biosensing
Online: 19 November 2020 (07:30:22 CET)
Non-alcoholic fatty liver disease (NAFLD) is a metabolic disorder associated with chronic lipid accumulation within the hepatocytes. It is the most common liver disorder worldwide and is estimated to be present in up to 25% of the world's population. However, the real prevalence of this disease and the associated disorders is unknown, mainly because reliable and applicable diagnostic tools are lacking. It is known that the level of albumin, a pleiotropic protein synthesized by hepatocytes, is correlated with the correct function of the liver. The development of a complementary tool that allows the direct, sensitive, and label-free monitoring of albumin secretion in hepatocyte cell culture can provide insight into mechanisms and drug action in NAFLD. With this aim, we have developed a simple integrated plasmonic biosensor based on gold nanogratings formed from the periodic nanostructures present in commercial Blu-ray optical discs. This sensor allows the direct and label-free monitoring of albumin in a 2D fatty liver disease model under flow conditions using a highly specific polyclonal antibody. The technology avoids both amplification and blocking steps and shows a limit of detection in the pM range (≈ 0.39 ng/mL). Using this technology, we identified the optimal fetal bovine serum (FBS) concentration to maximize lipid accumulation within the cells. Moreover, we found that on the third day after the lipid challenge, the hepatocytes increased the amount of albumin secreted. These data demonstrate the ability of hepatocytes to respond to lipid stimulation by releasing more albumin. Further investigation is needed to unveil the biological significance of this cell behaviour.
ARTICLE | doi:10.20944/preprints201810.0143.v2
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: Industry 4.0; XaaS; SemSOA; business process optimization; scalable cloud service deployment; process service plan just-in-time adaptation; BPMN partial fault tolerance
Online: 22 November 2018 (05:29:31 CET)
A new requirement for manufacturing companies in Industry 4.0 is to be flexible with respect to changes in demand, requiring them to react rapidly and efficiently regarding production capacities. Together with the trend towards Service-Oriented Architectures (SOA), this requirement induces a need for agile collaboration among supply chain partners, but also between different divisions or branches of the same company. To address this collaboration challenge, we propose a novel pragmatic approach for process analysis, implementation, and execution. This is achieved through sets of semantic annotations of business process models encoded into BPMN 2.0 extensions. The building blocks for such manufacturing processes are the individual available services, which are also semantically annotated according to Everything-as-a-Service (XaaS) principles and stored in a common marketplace. The optimization of such manufacturing processes combines pattern-based semantic composition of services with their non-functional aspects. This is achieved by means of Quality-of-Service (QoS)-based Constraint Optimization Problem (COP) solving, resulting in an automatic implementation of service-based manufacturing processes. The produced solution is mapped back to the BPMN 2.0 standard formalism by means of the introduced extension elements, fully detailing the enactable optimal process service plan produced. This approach allows enacting a process instance using just-in-time service leasing, allocation of resources, and dynamic replanning in the case of failures. It provides the best compromise between external visibility, control, and flexibility. In this way, it offers an optimal approach to implementing business process models in a fully service-oriented fashion, with user-defined QoS metrics, just-in-time execution, and basic dynamic repair capabilities.
This paper presents the described approach and its technical architecture, and depicts an initial industrial application in the manufacturing domain of aluminum forging for bicycle hull body forming, where the advantages stemming from the main capabilities of this approach are sketched.
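The QoS-based COP solving mentioned above can be illustrated by a brute-force sketch: pick one service per process task so as to minimize total cost under a latency budget. The service names and QoS values below are hypothetical, and a real solver would use constraint propagation rather than enumeration.

```python
from itertools import product

# Hypothetical service marketplace: candidates per process task,
# each annotated with (name, cost, latency_ms) as its QoS profile
candidates = {
    "cut":    [("cut_fast", 9, 40), ("cut_cheap", 4, 120)],
    "forge":  [("forge_a", 15, 80), ("forge_b", 11, 150)],
    "polish": [("polish_x", 6, 60), ("polish_y", 3, 90)],
}

def optimal_plan(max_latency_ms):
    """Exhaustive QoS-based selection: min total cost under a latency budget."""
    best = None
    for combo in product(*candidates.values()):
        cost = sum(c for _, c, _ in combo)
        latency = sum(l for _, _, l in combo)
        if latency <= max_latency_ms and (best is None or cost < best[0]):
            best = (cost, [name for name, _, _ in combo])
    return best

plan = optimal_plan(300)
# The globally cheapest combination violates the 300 ms budget, so the
# solver trades a slightly higher cost for a feasible latency
```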
ARTICLE | doi:10.20944/preprints202106.0459.v1
Subject: Engineering, Automotive Engineering Keywords: Autonomous Driving System; In-Car Gaming; Driver Behavior; Driving Related Tasks; 3D-VR/AR
Online: 17 June 2021 (12:29:00 CEST)
As Automated Driving Systems (ADS) technology is assimilated into the market, the driver's role will shift to a supervisory one. A key point to consider is the driver's engagement in a secondary task to keep the driver/user in the control loop. The paper's objective is to monitor driver engagement with a game and identify any impact the task has on hazard recognition. We designed a driving simulation using Unity3D and incorporated three conditions: No-task, AR-Video, and AR-Game. The driver engaged in an AR object-interception game while monitoring the road for threatening road scenarios. The results showed less than a one-second difference between the mean reaction times for the gaming task (mean = 2.55 s, std = 0.1002 s) and the no-task condition (mean = 2.55 s, std = 0.1002 s). Game scoring followed three profiles/phases: learning, saturation, and decline. From these profiles, it is possible to quantify and infer the driver's engagement with the game task. The paper thus proposes an alternative form of monitoring that also has utility for the user, i.e., entertainment. Further AR-Game experiments focusing on a real-world car environment will be performed to confirm the performance, following the recommendations derived from the current test.
ARTICLE | doi:10.20944/preprints201702.0050.v1
Subject: Engineering, Electrical & Electronic Engineering Keywords: Digital Lock-in Amplifier (DLIA); Field Programmable Gate Array (FPGA); Near Infrared Spectroscopy (NIRS); Hardware Description Language (HDL); Light Emitting Diode (LED); Silicon Photomultiplier (SiPM); Microprocessors
Online: 14 February 2017 (09:11:38 CET)
Functional Near Infrared Spectroscopy (fNIRS) systems for e-health applications usually suffer from poor signal detection, mainly due to a low end-to-end signal-to-noise ratio of the electronics chain. Lock-In Amplifiers (LIA) historically represent a powerful technique for improving performance in such circumstances. In this work, a digital LIA system based on a Zynq® Field Programmable Gate Array (FPGA) was designed and implemented to explore whether this technique might improve fNIRS system performance. More broadly, the flexibility of an FPGA-based solution was investigated, with particular emphasis on the digital filter parameters needed in the digital LIA, and their impact on the final signal detection and noise rejection capability was evaluated. The realized architecture was a mixed solution of VHDL hardware modules and software modules running on a softcore microprocessor. Experimental results have shown the soundness of the proposed solutions, and comparative details among different implementations are given. Finally, a key aspect taken into account throughout the design was modularity, allowing the number of input channels to be increased easily while avoiding growth in the design cost of the electronics system.
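The lock-in principle behind such a system can be sketched in a few lines: the noisy input is mixed with in-phase and quadrature references at the modulation frequency and low-pass filtered (here, a plain mean over an integer number of cycles), recovering the amplitude and phase of a signal buried in noise. All parameters are illustrative, not the paper's design values.

```python
import numpy as np

fs = 100_000.0               # sampling rate, Hz
f_ref = 1_000.0              # reference (modulation) frequency, Hz
t = np.arange(0, 1.0, 1 / fs)

# Weak signal at the reference frequency, buried in much larger noise
amplitude, phase = 0.05, 0.3
rng = np.random.default_rng(1)
x = amplitude * np.cos(2 * np.pi * f_ref * t + phase) + rng.normal(0, 0.5, t.size)

# Mix with quadrature references, then low-pass filter (here: a mean)
i = 2 * np.mean(x * np.cos(2 * np.pi * f_ref * t))
q = 2 * np.mean(x * -np.sin(2 * np.pi * f_ref * t))

r = np.hypot(i, q)           # recovered amplitude, close to 0.05
theta = np.arctan2(q, i)     # recovered phase, close to 0.3 rad
```

In the FPGA implementation, the mean is replaced by a configurable digital low-pass filter, whose parameters are exactly the ones the paper studies.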
ARTICLE | doi:10.20944/preprints202212.0488.v1
Subject: Mathematics & Computer Science, Probability And Statistics Keywords: Opioid crisis; PDMP; Pill Mill; Difference-in-Difference; Policy Analysis; Pharmaceutical Supply Chain
Online: 26 December 2022 (11:10:57 CET)
The opioid crisis in the United States has had devastating effects on communities across the country, leading many states to pass legislation that limits the prescription of opioid medications in an effort to reduce the number of overdose deaths. This study evaluates the impact of two categories of regulation, Prescription Drug Monitoring Programs (PDMPs) and Pill Mill laws, on the supply of opioid prescriptions at the level of dispensers and distributors (excluding manufacturers) using ARCOS data. The study uses a difference-in-difference method with a two-way fixed-effects design to analyze the data. It finds that both regulations are associated with reductions in the volume of opioid distribution. However, the study reveals that these regulations may have unintended consequences, such as shifting the distribution of controlled substances to neighboring states. For example, in Tennessee, the implementation of operational PDMP regulations reduced in-state distribution of opioid drugs by 3.36% (95% CI, 2.37 to 4.3), while out-of-state distribution to Georgia, which did not have effective PDMP regulations in place, increased by 16.93% (95% CI, 16.42 to 17.44). Our findings emphasize that policymakers should consider the potential for unintended distribution shifts of opioid drugs to neighboring states with laxer regulations, as well as varying impacts on different dispenser types.
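The difference-in-difference estimator at the heart of such an analysis compares the before/after change in a treated state against that of a control state. A toy numerical sketch (none of these numbers are ARCOS data):

```python
# Two-way difference-in-difference on illustrative mean log opioid
# distribution volumes for one treated and one control state.
pre_treated, post_treated = 4.60, 4.45   # treated state, before/after regulation
pre_control, post_control = 4.50, 4.52   # control state without the regulation

# DiD = (change in treated) - (change in control)
did = (post_treated - pre_treated) - (post_control - pre_control)
# did < 0: regulation associated with reduced in-state distribution;
# on a log scale, did approximates the proportional change
```

The full two-way fixed-effects design generalizes this to many states and periods, absorbing state and time effects in a regression rather than a simple four-cell comparison.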
ARTICLE | doi:10.20944/preprints201911.0012.v4
Subject: Materials Science, Polymers & Plastics Keywords: interfacial polymerization; in-situ FT-IR spectroscopy; thin-film composite membrane; nanofiltration membrane
Online: 8 January 2020 (09:04:29 CET)
The interfacial polymerization (IP) of piperazine (PIP) and trimesoyl chloride (TMC) has been extensively utilized to synthesize nanofiltration (NF) membranes. However, it is still a huge challenge to monitor the IP reaction because of the fast reaction rate and the ultra-thin film formed. Herein, two effective strategies were applied to reduce the IP reaction rate: (1) the introduction of hydrophilic interlayers between the porous substrate and the formed polyamide layer, and (2) the addition of macromolecular additives to the aqueous solution of PIP. As a result, in situ Fourier transform infrared (FT-IR) spectroscopy was used for the first time to monitor the IP reaction of PIP/TMC with hydrophilic interlayers or macromolecular additives in the aqueous solution of PIP. Moreover, the growth of the formed polyamide layer on the substrate was studied in real time. The in situ FT-IR experimental results confirmed that the IP reaction rates were effectively suppressed and that the thickness of the formed polyamide layer was reduced from 138 ± 24 nm to 46 ± 2 nm according to TEM observation. Furthermore, an optimized NF membrane with excellent performance was consequently obtained, featuring boosted water permeance of about 141–238 L/(m²·h·MPa) and superior salt rejection of Na2SO4 > 98.4%.
ARTICLE | doi:10.20944/preprints202005.0007.v2
Subject: Mathematics & Computer Science, Artificial Intelligence & Robotics Keywords: Process Mining; Business Processes; Natural Language Processing; Machine Learning
Online: 3 September 2021 (10:25:25 CEST)
Communication is indispensable to today's lifestyle, and thanks to technology, millions of people can communicate almost instantly. This breakthrough has transformed organizations to the degree that they generate billions of emails daily to facilitate their operations. There is implicit information behind this vast corpus of human-generated content that can be mined and used for their benefit. This paper addresses the opportunity that email logs can bring to organizations and proposes an approach to discover process models by combining supervised text classification and process mining. The framework consists of two main steps: text classification and process mining. First, emails are classified with supervised machine learning; then, the Fuzzy Miner is used to mine the processes. To further investigate the applicability of this framework, we also applied it to a real-life dataset from a case study organization.
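The two steps of such a framework can be sketched end to end: label each email with a process activity, then turn the labeled emails into an event log and count the directly-follows relation, which is the raw material of miners like the Fuzzy Miner. This is a bare-bones illustration (a word-overlap classifier and invented emails and labels), not the paper's trained model:

```python
from collections import Counter

# Tiny labeled training corpus (hypothetical): activity label -> examples.
TRAIN = {
    "receive_order": ["purchase order for ten items", "new order quantity"],
    "send_invoice": ["invoice attached payment amount due", "invoice total"],
    "ship_goods": ["order shipped tracking number", "delivery dispatched"],
}

VOCAB = {label: Counter(w for text in texts for w in text.split())
         for label, texts in TRAIN.items()}

def classify(text):
    """Score each activity label by word overlap with its training
    vocabulary -- a bare-bones stand-in for a supervised classifier."""
    words = text.lower().split()
    return max(VOCAB, key=lambda lb: sum(VOCAB[lb][w] for w in words))

def directly_follows(emails):
    """Turn classified emails into an event log and count the
    directly-follows relation used by process discovery algorithms."""
    events = sorted((case, ts, classify(body)) for case, ts, body in emails)
    dfg = Counter()
    for (c1, _, a1), (c2, _, a2) in zip(events, events[1:]):
        if c1 == c2:                       # only count within one case
            dfg[(a1, a2)] += 1
    return dfg

emails = [
    ("case1", 1, "please confirm the purchase order for ten items"),
    ("case1", 2, "your invoice and payment amount are attached"),
    ("case1", 3, "goods shipped tracking number enclosed"),
    ("case2", 1, "new order received confirm quantity"),
    ("case2", 2, "invoice with the total amount due"),
]
dfg = directly_follows(emails)
```

A real pipeline would use a proper classifier (e.g. TF-IDF features with a linear model) and feed the resulting event log into a process mining tool.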
CASE REPORT | doi:10.20944/preprints202107.0340.v1
Subject: Medicine & Pharmacology, Allergology Keywords: cerebral blood flow and oxygenation; diffuse correlation spectroscopy; EEG; traumatic brain injury; neurointensive care unit; neuromonitoring
Online: 14 July 2021 (15:19:40 CEST)
Survivors of severe brain injury may require care in a neurointensive care unit (neuro-ICU), where the brain is vulnerable to secondary brain injury. Thus, there is a need for noninvasive, bedside, continuous cerebral blood flow monitoring approaches in the neuro-ICU. Our goal is to address this need through combined EEG and functional optical spectroscopy (EEG-Optical) instrumentation and analysis, providing a complementary fusion of data about brain activity and function. The present case demonstrates that, in a patient with traumatic brain injury, noninvasive cerebral blood flow transients can be recorded that correlate with gold-standard invasive measurements and with frequency-content changes in the EEG data during clinical care.
ARTICLE | doi:10.20944/preprints201912.0227.v1
Subject: Keywords: information; process; process algebra; causal tapestry; process tapestry; transience; contextuality; emergence
Online: 17 December 2019 (10:07:58 CET)
The Process Algebra model has been shown to provide an alternative mathematical framework for non-relativistic quantum mechanics (NRQM). It reproduces the wave functions of NRQM to a high degree of accuracy. It posits a fundamental level of finite, discrete events upon which the usual entities of NRQM supervene. It has been suggested that the Process Algebra model provides a true completion of NRQM, free of divergences and paradoxes, with causally local information propagation, contextuality and realism. Arguments in support of these claims have been mathematical. What has been missing is an ontology of this fundamental level from which the formalism naturally emerges. In this paper it is argued that information and information flow provide this ontology. Higher-level constructs such as energy, momentum, mass and spacetime are all emergent from this fundamental level.
ARTICLE | doi:10.20944/preprints201707.0050.v1
Subject: Engineering, Control & Systems Engineering Keywords: ventilation process prediction; safety of mine ventilation system; sensitivity of the main air flows in ventilation subnetworks
Online: 18 July 2017 (12:27:46 CEST)
This paper presents a methodology for determining the sensitivity of the main air flow directions in ventilation subnetworks to changes of aerodynamic resistance and of air density in mine workings. Formulae for determining the sensitivity of the main subnetwork air flows, by establishing the degree of dependency of the air volume stream in a given working on variations in the resistance or air density of other workings of the network, have been developed. They have been implemented in the VentGraph mine ventilation network simulator. This software, widely used in Polish collieries, provides an extended possibility to predict the ventilation process and air distribution and, in the case of an underground fire, also the spread of combustion gases. The new method facilitates the assessment by mine ventilation services of the stability of ventilation systems in exploitation areas and the determination of the sensitivity of the main subnetwork air flow directions to changes of aerodynamic resistance and air density. Recently, in some Polish collieries, new longwalls have been developed in seams located deeper than the bottom of the intake shaft. Such a solution is called “exploitation below the level of access” or “sublevel” exploitation. The new approach may be applied to such developments to assess potential changes of direction and air flow rates. In addition, an interpretation of the developed sensitivity indicator is presented. When analyzing air distributions for sublevel exploitation, the application of current numerical models brings tangible benefits, such as the evaluation of the safety or risk levels of such exploitation. Application of the VentGraph computer program, and particularly the module POŻAR (fire) with the newly developed options, enables an additional use of the sensitivity indicator in evaluating air flow safety levels for the risks present during exploitation below the level of the intake shaft.
The analyses performed and examples presented enabled useful conclusions in mining practice to be drawn.
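The idea of a flow sensitivity indicator can be illustrated on the smallest possible network. The sketch below (a simplified two-branch illustration under Atkinson's square law, not the VentGraph formulation; the flow and resistance values are invented) computes the finite-difference sensitivity of one branch's flow to a resistance change in the other:

```python
import math

def parallel_split(q_total, r1, r2):
    """Flow split across two parallel airways under Atkinson's square
    law p = R*Q**2: equal pressure drop forces Q1/Q2 = sqrt(R2/R1)."""
    s1, s2 = math.sqrt(r1), math.sqrt(r2)
    q1 = q_total * s2 / (s1 + s2)
    return q1, q_total - q1

def sensitivity_q1_to_r2(q_total, r1, r2, dr=1e-8):
    """Finite-difference sensitivity of branch-1 flow to a change in the
    aerodynamic resistance of branch 2."""
    q1_lo, _ = parallel_split(q_total, r1, r2)
    q1_hi, _ = parallel_split(q_total, r1, r2 + dr)
    return (q1_hi - q1_lo) / dr

q1, q2 = parallel_split(100.0, 0.05, 0.05)   # illustrative units
sens = sensitivity_q1_to_r2(100.0, 0.05, 0.05)
```

A positive sensitivity means that increasing the resistance of branch 2 diverts flow into branch 1; a full network simulator applies the same perturbation idea branch by branch.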
ARTICLE | doi:10.20944/preprints202009.0737.v1
Subject: Mathematics & Computer Science, Algebra & Number Theory Keywords: process trees; Petri nets; workflow nets; process mining
Online: 30 September 2020 (10:37:25 CEST)
Since their introduction, process trees have been frequently used as a process modeling formalism in many process mining algorithms. A process tree is a (mathematical) tree-based model of a process, in which internal vertices represent behavioral control-flow relations and leaves represent process activities. Translation of a process tree into a sound Workflow net (WF-net) is trivial; however, the reverse is not the case. At the same time, an algorithm that translates a WF-net into a process tree is of great interest; e.g., explicit knowledge of the control-flow hierarchy in a WF-net allows one to reason about its behavior more easily. Hence, in this paper, we present such an algorithm: it detects whether a WF-net corresponds to a process tree and, if so, constructs it. We prove that, when the algorithm finds a process tree, the language of the process tree is equal to the language of the original WF-net. The experiments conducted show that the algorithm's implementation has a quadratic time complexity in the size of the WF-net. Furthermore, the experiments show strong evidence of process tree re-discoverability.
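To make the notion of a process tree's language concrete, here is a minimal loop-free sketch (tuples encode subtrees; the operator names 'SEQ', 'X' and 'AND' are this example's own, not the paper's notation): sequence concatenates, exclusive choice unions, and parallelism interleaves.

```python
def interleavings(a, b):
    """All interleavings of two traces (tuples), preserving each order."""
    if not a:
        return {b}
    if not b:
        return {a}
    return ({(a[0],) + r for r in interleavings(a[1:], b)}
            | {(b[0],) + r for r in interleavings(a, b[1:])})

def language(tree):
    """Set of traces of a loop-free process tree.

    A tree is either an activity name (string) or a tuple
    (operator, child, child, ...) with operator in {'SEQ', 'X', 'AND'}.
    """
    if isinstance(tree, str):
        return {(tree,)}
    op, *children = tree
    langs = [language(c) for c in children]
    if op == 'X':                      # exclusive choice: union
        return set().union(*langs)
    if op == 'SEQ':                    # sequence: concatenation
        acc = {()}
        for lang in langs:
            acc = {p + s for p in acc for s in lang}
        return acc
    if op == 'AND':                    # parallel: all interleavings
        acc = {()}
        for lang in langs:
            acc = {t for p in acc for s in lang
                   for t in interleavings(p, s)}
        return acc
    raise ValueError(op)
```

Language equality of the tree and its WF-net, as proven in the paper, means a conformance check against either model accepts exactly the same traces.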
ARTICLE | doi:10.20944/preprints202002.0122.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: structured information control net; process mining; process analyzing; structural analysis; behavioral analysis; process rediscovery
Online: 10 February 2020 (09:37:56 CET)
Process (or business process) management systems support defining, executing, monitoring and managing process models deployed in process-aware enterprises. Accordingly, such systems comprise three subsystems: a modeling subsystem, an enacting subsystem and a mining subsystem. In recent times, the mining subsystem has become essential. Many enterprises have successfully introduced and applied process automation technology through the modeling and enacting subsystems. As attention shifts to redesigning and reengineering the deployed process models, it becomes important for the mining subsystem to cooperate with the analyzing subsystem; the essential capability is a seamless integration between the design work in the modeling subsystem and the redesign work in the mining subsystem. In other words, the discovery functionality of the mining subsystem and the analyzing functionality of the modeling subsystem need to be seamlessly integrated. This integrated approach is particularly suitable when the deployed process models discovered by the mining subsystem are complex and very large-scale. In this paper, we propose an integrated approach for seamlessly and effectively providing the mining and analyzing functionalities for the redesign of very large-scale and massively parallel process models discovered from their enactment event logs. The approach aims at analyzing not only their structural complexity and correctness but also their animation-based behavioral properness, and is concretized in a sophisticated analyzer.
The core function of the analyzer is to discover a very large-scale and massively parallel process model from a process log dataset and to validate the structural complexity and the syntactical and behavioral properness of the discovered model. Finally, this paper gives a detailed description of the system architecture and its functional integration of process mining and process analysis. More precisely, we devise a series of algorithms for extracting the structural constructs as well as for visualizing the behavioral properness of the discovered very large-scale and massively parallel process models. As experimental validation, we apply the proposed approach and analyzer to two process enactment event log datasets available on the website of the 4TU.Centre for Research Data.
ARTICLE | doi:10.20944/preprints202206.0405.v1
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: management methods; management; system; technologies; production process; process improvement
Online: 29 June 2022 (10:16:24 CEST)
The contribution deals with the selection of progressive methods and technologies to improve the management of production processes in the industrial area. The first chapter summarises the theoretical starting points for the management of production processes and their progressive concepts. Chapter 2 analyses the current state of the use of methods and technologies in production management and summarises its problem areas. Chapter 3 presents the design of a system for selecting the appropriate method and technology to improve the management of the production process. The presented system is built from blocks for data collection, for benchmarking the performance of industrial enterprises' production processes, and for data mining to gain knowledge of the impact of the methods and technologies used on the performance of production processes. The last block of the system presents the obtained results, which set out the recommendation for the choice of a progressive method and technology. The system is partially verified in the last chapter of the contribution.
ARTICLE | doi:10.20944/preprints201809.0073.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: Software Process Analysis, Software Process Improvement, Data Provenance
Online: 4 September 2018 (16:30:51 CEST)
Companies have been increasing the amount of data that they collect from their systems and processes, given the decrease in the cost of memory and storage technologies in recent years. The emergence of technologies such as Big Data, Cloud Computing and E-Science, and the growing complexity of information systems, made evident that traceability and provenance are promising approaches. Provenance has been successfully used in complex domains, like health sciences, chemical industries and scientific computing, considering that these areas require a comprehensive semantic traceability mechanism. Based on this, we investigate the use of provenance in the context of Software Processes (SP) and introduce a novel approach based on provenance concepts to model and represent SP data. It addresses capturing, storing, inferring new information from, and visualizing SP provenance data. The main contribution of our approach is PROV-SwProcess, a provenance model that deals with the specificities of SP and supports process managers facing vast amounts of execution data during process analysis and data-driven decision-making. A set of analysis possibilities was derived from this model, using SP goals and questions. A case study was conducted in collaboration with a software development company to instantiate the PROV-SwProcess model (using the proposed approach) with real-world process data. This study showed that 87.5% of the analysis possibilities using real data were correct and can assist in decision-making, while 62.5% of them could not be performed by the process manager using the current dashboard or process management tool.
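The core of a provenance model of this kind is a graph of `used` and `wasGeneratedBy` relations over artifacts and activities, which can then be queried for lineage. The sketch below is a minimal PROV-inspired illustration with invented artifact names, not the PROV-SwProcess model itself:

```python
from dataclasses import dataclass, field

@dataclass
class ProvGraph:
    """Minimal PROV-style store: 'used' and 'wasGeneratedBy' edges over
    software-process entities (artifacts) and activities (tasks)."""
    used: set = field(default_factory=set)        # (activity, entity)
    generated: set = field(default_factory=set)   # (entity, activity)

    def lineage(self, entity):
        """All artifacts the given artifact transitively depends on."""
        deps, frontier = set(), {entity}
        while frontier:
            e = frontier.pop()
            for ent, act in self.generated:
                if ent == e:
                    for a, inp in self.used:
                        if a == act and inp not in deps:
                            deps.add(inp)
                            frontier.add(inp)
        return deps

# Hypothetical software-process trace: a commit consumes the spec and the
# old code and produces new code; a review consumes the new code.
g = ProvGraph()
g.used |= {("commit", "spec"), ("commit", "code_v1"), ("review", "code_v2")}
g.generated |= {("code_v2", "commit"), ("report", "review")}
```

Queries such as "which inputs does this report ultimately depend on?" then reduce to a transitive traversal of the two edge sets.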
ARTICLE | doi:10.20944/preprints202201.0009.v1
Subject: Engineering, Energy & Fuel Technology Keywords: OLP; Deoxygenation; Absorber columns; Process flowsheet; Process Simulation; Aspen-HYSYS.
Online: 4 January 2022 (14:54:24 CET)
In this work, the deoxygenation of organic liquid products (OLP) obtained by thermal catalytic cracking of palm oil at 450 °C and 1.0 atmosphere, with 10% (wt.) Na2CO3 as catalyst, in multistage countercurrent absorber columns using supercritical carbon dioxide (SC-CO2) as solvent was systematically investigated with the Aspen-HYSYS process simulator. In a previous study, the thermodynamic data basis and EOS modeling necessary to simulate the deoxygenation of OLP were presented [Molecules 2021, 26, 4382. https://doi.org/10.3390/molecules26144382]. This work addresses a new flowsheet, consisting of three absorber columns, 10 expansion valves, 10 flash drums, eight heat exchangers, one pressure pump, and two CO2 make-up streams, aiming to improve the deacidification of OLP. The simulation was performed at 333 K, 140 bar, and (S/F) = 17; 350 K, 140 bar, and (S/F) = 38; and 333 K, 140 bar, and (S/F) = 25. The simulation shows that 81.49% of OLP could be recovered and that the concentrations of hydrocarbons in the extracts of absorber-01 and absorber-02 were 96.95 and 92.78% (wt.) on a solvent-free basis, while the bottom stream of absorber-03 was enriched in oxygenated compounds with concentrations up to 32.66% (wt.) on a solvent-free basis, showing that SC-CO2 was able to deacidify OLP and to obtain fractions with lower olefin content. The best deacidifying conditions were obtained at 333 K, 140 bar, and (S/F) = 17.
ARTICLE | doi:10.20944/preprints201808.0360.v1
Subject: Engineering, Mechanical Engineering Keywords: manufacturing sustainability; milling process; turning process; energy consumption; power consumption.
Online: 20 August 2018 (14:52:43 CEST)
Design for sustainable manufacturing must consider both ecological and financial constraints. The manufacturing industry needs to advance in sustainable manufacturing by accounting for environmental factors. All the factors that affect the environment need to be analyzed so that proper amendments or suggestions can be provided. To this end, a computer-aided life cycle inventory model is presented for CNC milling and turning processes. Based on the utilization of resources and stages, the whole machining operation time can be divided into three phases, named the process (milling or turning), idle and basic times. As the parameters for evaluating the process times differ, i.e. depth and width of cut in the case of milling and initial and final diameters for turning, two case studies are presented, one each for the milling and turning processes. The effect of material choice has been studied for the different processes. Dense, hard materials take more time to finish a job because of the lower cutting speeds and feed rates compared with soft materials. In addition, face milling takes more time and consumes more power than peripheral milling, due to the longer retraction time caused by the over-travel distance and the lower vertical traverse speeds compared with the horizontal traverse speed used in peripheral retraction.
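The process-phase time and energy for a milling pass follow from standard machining formulas (spindle speed from cutting speed, table feed from feed per tooth). The sketch below is an illustration with invented cutting parameters, not the paper's inventory model:

```python
import math

def milling_pass(length_mm, approach_mm, overtravel_mm, vc_m_min,
                 tool_d_mm, teeth, fz_mm, depth_mm, width_mm, k_j_mm3):
    """Process-phase time and cutting energy for one milling pass from
    standard machining formulas (spindle speed, table feed, MRR)."""
    n_rpm = 1000.0 * vc_m_min / (math.pi * tool_d_mm)   # spindle speed
    vf = fz_mm * teeth * n_rpm                          # table feed, mm/min
    t_min = (length_mm + approach_mm + overtravel_mm) / vf
    mrr = depth_mm * width_mm * vf / 60.0               # mm^3/s
    energy_j = k_j_mm3 * mrr * t_min * 60.0             # E = k * volume cut
    return t_min, energy_j

# Hypothetical pass: a soft alloy cut fast vs. a hard steel cut slowly,
# with a higher specific cutting energy for the hard material.
t_soft, e_soft = milling_pass(150, 5, 5, 200, 50, 4, 0.10, 2, 40, 0.7)
t_hard, e_hard = milling_pass(150, 5, 5, 60, 50, 4, 0.10, 2, 40, 4.0)
```

The lower permissible cutting speed of the hard material directly lengthens the process phase and raises the energy, matching the qualitative conclusion above.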
Subject: Materials Science, General Materials Science Keywords: Semisolid casting; RheoMetal process; GISS process; SEED process; production; capability; surface treatment; heat treatment; tool-life; productivity
Online: 19 September 2020 (05:15:22 CEST)
Semisolid casting of aluminium alloys is growing. For magnesium alloys, Thixomoulding has become the dominant process around the world. For aluminium, the situation is different, as semisolid processing of aluminium is technically more challenging than for magnesium. Today, three processes lead the process implementation: the GISS method, the RheoMetal process and the SEED process. These processes all have strengths and weaknesses and will fit a particular range of applications. The current paper aims at examining the strengths and weaknesses of the processes to identify product types and niche applications for each process, based on current applications and the development directions taken for these processes.
Subject: Mathematics & Computer Science, Probability And Statistics Keywords: contagion risk; insurance premium; aggregate claims; default-free bond pricing; self-exciting process; Hawkes process; CIR process
Online: 28 August 2019 (04:00:41 CEST)
In this paper, we study a generalised CIR process with externally-exciting and self-exciting jumps, and focus on the distributional properties and applications of this process and its aggregated process. The first and second moments of this jump-diffusion process are used to calculate the insurance premium based on the mean-variance principle. The Laplace transform of the aggregated process is derived, leading to an application for pricing default-free bonds that captures the impacts of both exogenous and endogenous shocks. Illustrative numerical examples and comparisons with other models are also provided.
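A discretized simulation makes the two jump channels tangible: a diffusion with mean reversion and square-root volatility, plus externally-exciting jumps at a constant rate and self-exciting (Hawkes-style) jumps whose intensity spikes after each arrival and then decays. All parameter values below are invented for illustration; this Euler sketch is not the paper's analytical treatment:

```python
import math
import random

def simulate_cir_hawkes(r0=0.03, a=2.0, b=0.04, sigma=0.15,
                        lam_inf=0.5, beta=3.0, delta=1.0,
                        jump_mean=0.01, rho=0.3,
                        T=5.0, n=5000, seed=7):
    """Euler path of a CIR diffusion with self-exciting (Hawkes) and
    externally-exciting compound-Poisson jumps (illustrative parameters)."""
    rng = random.Random(seed)
    dt = T / n
    r, lam = r0, lam_inf
    path = [r]
    for _ in range(n):
        z = rng.gauss(0.0, 1.0)
        r += a * (b - r) * dt + sigma * math.sqrt(max(r, 0.0) * dt) * z
        # self-exciting jump: probability ~ lam*dt in this small step
        if rng.random() < lam * dt:
            r += rng.expovariate(1.0 / jump_mean)   # exponential jump size
            lam += delta                            # intensity excites itself
        # externally-exciting jump: constant rate rho
        if rng.random() < rho * dt:
            r += rng.expovariate(1.0 / jump_mean)
        lam = lam_inf + (lam - lam_inf) * math.exp(-beta * dt)
        r = max(r, 0.0)                             # keep the state nonnegative
        path.append(r)
    return path

path = simulate_cir_hawkes()
```

Aggregating (summing) such paths is what the Laplace transform in the paper characterizes in closed form.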
ARTICLE | doi:10.20944/preprints201709.0003.v1
Subject: Keywords: business workflows; discrete event systems; event logs; configurable process models; configurable process trees; process mining; business processes
Online: 1 September 2017 (17:14:50 CEST)
Configurable process models are frequently used to represent business workflows and other discrete event systems among different branches of large organizations: they unify the commonalities shared by all branches and, at the same time, describe their differences. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms, and a greedy heuristic) that use event data to automatically derive a process model from a configurable process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested on both business-like event logs recorded in a higher-education ERP system and a real case scenario involving a set of Dutch municipalities.
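The simplest of the three strategies can be sketched in a few lines: configure each node independently by choosing the variant most frequently observed in the branch's event log. This is a greedy heuristic in spirit only (node names and log are invented), not the authors' algorithm:

```python
from collections import Counter

def greedy_configure(config_nodes, log):
    """Greedy, per-node configuration: for each configurable node pick the
    candidate activity observed most often in the branch's event log.

    config_nodes: {node: [candidate activities]};  log: list of traces.
    """
    freq = Counter(act for trace in log for act in trace)
    return {node: max(cands, key=lambda a: freq[a])
            for node, cands in config_nodes.items()}

# Hypothetical configurable model: two XOR nodes with two variants each.
config_nodes = {"payment": ["pay_cash", "pay_card"],
                "delivery": ["pickup", "ship"]}
log = [("pay_card", "ship")] * 8 + [("pay_cash", "pickup")] * 2
choice = greedy_configure(config_nodes, log)
```

Because each node is decided in isolation, this greedy variant ignores the interactions between choices that motivate the exhaustive and genetic strategies.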
ARTICLE | doi:10.20944/preprints202207.0153.v2
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: AHP process; Saaty method; Eigenvalue; Excel program; managerial decision-making process
Online: 28 July 2022 (08:49:42 CEST)
This article describes the Analytic Hierarchy Process (AHP), which can be calculated by several methods. For managerial decision-making, it is essential to recognize which method is efficient for the calculation and determines the proper order of criteria. The article covers methods that can be used to calculate the pairwise-comparison matrix in the AHP for setting criteria, such as the geometric mean, the arithmetic mean, row sums of the adjusted Saaty matrix, reverse sums of the Saaty matrix columns, and row sums of the Saaty matrix. The paper focuses on the accuracy of the methods used. The results show that the most accurate method for determining the ranking is the Saaty (eigenvalue) method, and the second most accurate is the geometric mean, which achieves the same order as the Saaty method and is favourable for fast and easy determination of alternatives in the AHP. A survey carried out among managers who graduated from universities and colleges in Slovakia showed that the respondents considered the Saaty method the most complex and also the most difficult method, and the geometric mean the simplest. 44% of respondents stated that they are capable of using a program to calculate the AHP, 46% said they had experience with some method related to the managerial decision-making process, and 72% of managers considered it important to master a method for decision-making in their managerial position.
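The geometric mean method discussed above fits in a few lines, together with Saaty's consistency check. The 3x3 matrix below is a made-up, perfectly consistent example (criterion A twice B, four times C), so the consistency ratio comes out as zero:

```python
import math

def ahp_weights_geometric(matrix):
    """Priority weights from the row geometric means of a pairwise
    comparison (Saaty) matrix, normalized to sum to one."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

RI = {3: 0.58, 4: 0.90, 5: 1.12}   # Saaty's random consistency indices

def consistency_ratio(matrix, weights):
    """CR = CI / RI, with lambda_max estimated Saaty-style from A*w."""
    n = len(matrix)
    aw = [sum(matrix[i][j] * weights[j] for j in range(n)) for i in range(n)]
    lam_max = sum(aw[i] / weights[i] for i in range(n)) / n
    return ((lam_max - n) / (n - 1)) / RI[n]

# Perfectly consistent 3x3 example: A is twice B and four times C.
M = [[1.0, 2.0, 4.0],
     [0.5, 1.0, 2.0],
     [0.25, 0.5, 1.0]]
w = ahp_weights_geometric(M)
cr = consistency_ratio(M, w)
```

For a consistent matrix the geometric mean reproduces the exact eigenvector weights (4/7, 2/7, 1/7 here); for inconsistent judgments the two methods can diverge, which is what the paper's accuracy comparison measures.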
CONCEPT PAPER | doi:10.20944/preprints202106.0235.v1
Subject: Social Sciences, Accounting Keywords: evaluation; proximal outcomes; distal outcomes; process research; training process; postgraduate program
Online: 8 June 2021 (13:45:14 CEST)
This contribution illustrates the training evaluation system developed within the Master's Program in Family and Community Mediation at Università Cattolica del Sacro Cuore in Milan. It is an interim evaluation, which focuses on the training process and considers collaboration with the subjects of the training fundamental. The peculiarity of this work concerns the possibility of embedding research within the training process, following a logic of mutual enrichment in terms of both content and learning. The contribution illustrates in detail the outcome and process evaluation system, defining the perspective, the objectives, and the methodology of implementation. In particular, the outcome evaluation focuses on the distal and proximal outcomes of the training, while the process analysis focuses on the dynamics within the group of participants. Although further evaluations involving different training groups and other training processes are needed, this training evaluation system makes it possible to shed light on both the topic and the context in which training is delivered. The integration of different points of view and several levels of analysis allows the researchers to examine the individual path of each participant in depth as well as to obtain feedback on the progress of the training group as a whole.
Subject: Engineering, Biomedical & Chemical Engineering Keywords: process design; sustainable development; chemical industry; process industry; megatrends; design tools
Online: 5 January 2021 (11:32:26 CET)
This paper describes the state of the art and future opportunities for process design and sustainable development. In the Introduction, the main global megatrends and the European Union's response to two of them, the European Green Deal, are presented. The organization of professionals in the field, with their conferences and publications, supports both topics. A brief analysis of the documents published in the two most popular databases shows that the environmental dimension predominates, followed by the economic one, while the social pillar of sustainable development is undervalued. The main design tools for sustainability are described. As an important practical case, the European chemical and process industries are analyzed and their achievements in sustainable development are highlighted; in particular, their strategies are presented in more detail. The conclusions cover the most urgent future development areas of the process industries: carbon capture with utilization or storage, and tools for process analysis, simulation, synthesis and optimization; zero waste, the circular economy and resource efficiency already play an important role. However, more profound changes are needed in the coming decades, including a shift away from growth, with changes in habits, lifestyles and business models. Lifelong education for sustainable development will play a very important role in the growth of democracy and happiness instead of consumerism and neoliberalism.
ARTICLE | doi:10.20944/preprints201809.0156.v2
Subject: Medicine & Pharmacology, Pediatrics Keywords: cancer; delay in referral; children; delay in diagnosis; delay in treatment
Online: 10 May 2020 (18:21:17 CEST)
Background: Cancer has been one of the most critical health problems of the past decades and requires serious consideration, especially in developing countries. Methods: 80 parents of children with different types of cancer were studied via both a questionnaire and person-to-person interviews after giving their informed consent. Results: The greatest delays in patients' initial referral were due to the following factors: lack of attention to or ignoring of the first symptoms, delay in referring to a physician, low economic status, and even lack of family support for the patient. In addition, visiting several doctors after the initial diagnosis, uncertainty about the first proposed treatment method, and the high cost of treatment can be mentioned as the main causes of delays in the start of treatment. Conclusion: Education plays an important role in identifying the signs of cancer. In addition, a proper relationship and cooperation between the health system and physicians, as well as the provision of adequate information to patients, could lead to the long-term cooperation of these patients in continuing their treatment.
ARTICLE | doi:10.20944/preprints202101.0396.v1
Subject: Engineering, Automotive Engineering Keywords: biofuel; biobutanol; ABE-fermentation; Clostridium; continuous reactor; process model; multi stage process
Online: 20 January 2021 (10:59:00 CET)
The production of butanol, acetone and ethanol by Clostridium acetobutylicum is a biphasic fermentation process. In the first phase, the carbohydrate substrate is metabolized to acetic and butyric acid; in the following second phase, the product spectrum shifts towards the economically interesting solvents. Here we present a cascade of six continuous stirred tank reactors (CCSTR), which allows the time-dependent metabolic phases of an acetone-butanol-ethanol (ABE) batch fermentation to be performed in a spatial domain. Experimental data of steady states under four operating conditions - with variations of the pH in the first bioreactor between 4.3 and 5.6 as well as the total dilution rate between 0.042 1/h and 0.092 1/h - were used to optimize and validate a corresponding mathematical model. Beyond a residence time distribution representation and substrate, biomass and product kinetics, this model also includes the differentiation of cells between the metabolic states. Model simulations predict a final butanol concentration of 8.2 g/L and a butanol productivity of 0.75 g/(L h) in the CCSTR operated at a pH of 4.3 in bioreactor 1 and D = 0.092 1/h, while 31 % of the cells are differentiated to the solventogenic state. Aiming at an enrichment of solvent-producing cells, a feedback loop was introduced into the cascade - sending cells from a later stage of the process (bioreactor 4) back to an early stage (bioreactor 2). In agreement with the experimental observations, the model accurately predicted an increase of the butanol formation rate in bioreactor stages 2 and 3, resulting in an overall butanol productivity of 0.76 g/(L h) for the feedback loop cascade. The CCSTR presented here and the validated model will serve to investigate further ABE fermentation strategies for a controlled metabolic switch.
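The backbone of such a cascade model is a chain of CSTR mass balances, where each reactor's outflow feeds the next. The sketch below integrates generic Monod-kinetics balances to steady state for a small cascade; all kinetic parameters, yields and the simple growth-coupled product term are invented for illustration and are not the paper's validated model (which additionally tracks cell differentiation between metabolic states):

```python
def abe_cascade(n_reactors=3, d=0.2, s_feed=50.0, mu_max=0.4, ks=1.0,
                y_xs=0.4, alpha=0.3, t_end=400.0, dt=0.05):
    """Euler integration of Monod-kinetics mass balances for a cascade of
    continuous stirred tank reactors in series, run to (near) steady state.
    Returns per-reactor (biomass, substrate, product) concentrations."""
    x = [0.1] * n_reactors          # small inoculum in every reactor
    s = [s_feed] * n_reactors
    p = [0.0] * n_reactors
    for _ in range(int(t_end / dt)):
        xn, sn, pn = x[:], s[:], p[:]
        for i in range(n_reactors):
            x_in = x[i - 1] if i else 0.0       # inflow from upstream
            s_in = s[i - 1] if i else s_feed
            p_in = p[i - 1] if i else 0.0
            mu = mu_max * s[i] / (ks + s[i])    # Monod growth rate
            xn[i] += dt * (d * (x_in - x[i]) + mu * x[i])
            sn[i] += dt * (d * (s_in - s[i]) - mu * x[i] / y_xs)
            pn[i] += dt * (d * (p_in - p[i]) + alpha * mu * x[i])
            sn[i] = max(sn[i], 0.0)             # no negative substrate
        x, s, p = xn, sn, pn
    return x, s, p

x, s, p = abe_cascade()
```

Even this toy chain reproduces the qualitative picture: substrate is consumed progressively along the cascade while the product concentration accumulates from reactor to reactor.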
ARTICLE | doi:10.20944/preprints201911.0267.v1
Subject: Behavioral Sciences, Other Keywords: psychological wellbeing; gifted teenagers; giftedness in math; giftedness in humanities; giftedness in sports
Online: 24 November 2019 (04:47:02 CET)
This article presents a study of the psychological wellbeing of adolescents (n=168, age 15-17) gifted in math, humanities and sports and educated in advanced programs for gifted children. The theoretical framework of this study is the eudaimonic concept of psychological wellbeing by C. Ryff. Psychological wellbeing is measured with the Ryff wellbeing scales in the Russian adaptation by L.V. Zhukovskaya and E.G. Troshikhina. The study is aimed at understanding differences in the psychological wellbeing of gifted teenagers connected to gender and type of giftedness. The results suggest that the general wellbeing score did not differ for adolescents with different types of giftedness or of different genders. Separate components of psychological wellbeing, such as purpose in life and self-acceptance, are influenced by activity connected to the talent. Gender differences are subject to age-specific trends of personal development in adolescence. Type of giftedness might reinforce these trends.
ARTICLE | doi:10.20944/preprints202211.0014.v1
Subject: Mathematics & Computer Science, Artificial Intelligence & Robotics Keywords: atmospheric compensation; Gaussian process; hyperspectral
Online: 1 November 2022 (03:47:57 CET)
Atmospheric correction is the process of converting radiance measured at a spectral sensor to the reflectance of the materials in a multispectral or hyperspectral image. This is an important step for detecting or identifying the materials present in the pixel spectra. We present two machine learning models for atmospheric correction trained and tested on 100,000 batches of 40 reflectance spectra converted to radiance using MODTRAN, so the machine learning model learns the radiative transfer physics from MODTRAN. We created a theoretically interpretable Bayesian Gaussian process model and a deep learning autoencoder treating the atmosphere as noise. We compare both methods for estimating gain in the correction model to the well-known QUAC method of assuming a constant mean endmember reflectance. Prediction of reflectance using the Gaussian process model outperforms the other methods in terms of both accuracy and reliability.
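The workhorse behind the Bayesian model above is Gaussian process regression: the posterior mean k_*^T (K + sigma^2 I)^{-1} y. The following is a self-contained pure-Python sketch on a 1D toy function (a sine curve standing in for spectra; the kernel length scale and noise level are invented), not the paper's model:

```python
import math

def rbf(a, b, ell=1.0, var=1.0):
    """Squared-exponential covariance between two scalar inputs."""
    return var * math.exp(-0.5 * ((a - b) / ell) ** 2)

def solve(A, rhs):
    """Gaussian elimination with partial pivoting (small dense systems)."""
    n = len(A)
    M = [row[:] + [rhs[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k]
                              for k in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(xs, ys, x_star, noise=1e-4):
    """Posterior mean of GP regression: k_*^T (K + noise*I)^{-1} y."""
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    alpha = solve(K, ys)
    return sum(rbf(x_star, a) * w for a, w in zip(xs, alpha))

xs = [0.5 * i for i in range(7)]       # training inputs on [0, 3]
ys = [math.sin(x) for x in xs]         # toy stand-in for training spectra
pred = gp_predict(xs, ys, 1.25)
```

The same machinery, with kernels over spectra rather than scalars, lets the model interpolate the gain term while remaining probabilistically interpretable.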
ARTICLE | doi:10.20944/preprints202002.0191.v1
Online: 14 February 2020 (09:24:03 CET)
There is a lack of research based on in-depth theoretical and scientific knowledge to understand the visually impaired, and there has been little effort to apply early-intervention strategies that minimize the risks these individuals might encounter during development. This study used semi-structured interviews with eight persons with visual impairments who had various experiences with resiliency. Three resilience processes based on life experiences were identified: 1) Experience and Adaptation: "self-awareness of disability" and "adaptation to disability and the environment"; 2) Facing the Circumstances: "exposure to concealment and abuse," "suppression of potential," "denial and abandonment by family," "poverty and disability," "exchange and self-regulation," and "social integration"; and 3) Positive Reinforcement: "self-disclosure and jump-starting life," "maintenance of positive thinking," and "socioeconomic independence." These findings expand the understanding of the factors common to the resilience process experienced by individuals with visual impairment and highlight the importance of psychological, family, educational, and social support.
REVIEW | doi:10.20944/preprints201912.0276.v1
Subject: Engineering, Other Keywords: IoT in Healthcare; IoT in Vehicular Networks; Behaviors and Decision Making; IoT in Learning Environments; IoT in Mining; IoT in Energy Systems; IoT in Smart Cities; Sensors; Low Power Networks
Online: 20 December 2019 (12:37:09 CET)
With the Internet of Things (IoT) gaining presence throughout different industries, many new technologies have been introduced to support this undertaking. One such technology, wireless systems, has allowed different communication methods to be used to transfer data reliably, more cost-efficiently, and over longer distances. From a single house with only a few IoT devices, such as a smart light bulb or a smart thermostat connected to the network, all the way to complex systems that can control power grids across countries, IoT has been becoming a necessity in everyday life. This paper presents an overview of the devices, systems, and wireless technologies used in different IoT architectures (Healthcare, Vehicular Networks, Mining, Learning, Energy, Smart Cities, Behaviors and Decision Making), their origins, and the challenges faced to date and foreseen in the future.
ARTICLE | doi:10.20944/preprints201801.0214.v1
Online: 23 January 2018 (09:46:42 CET)
In an effort to discover an effective and selective antitumour agent, we previously reported the synthesis and anti-cancer potential of 4-(pyridin-4-yl)-6-(thiophen-2-yl)pyrimidin-2(1H)-one (SK-25), which showed significant cytotoxicity towards MiaPaCa-2 malignant cells, with an IC50 value of 1.95 μM, and was found to instigate apoptosis. In the present study, the antitumour efficacy of SK-25 was investigated in Ehrlich ascites tumour (solid), Sarcoma-180 (solid) tumour, and Ehrlich ascites carcinoma models. The compound was found to inhibit tumour development by 94.71% in Ehrlich ascites carcinoma (EAC), 59.06% in Ehrlich tumour (ET, solid), and 45.68% in Sarcoma-180 (solid) at a 30 mg/kg dose. Additionally, SK-25 was established to be non-toxic at a maximum tolerated dose of 1000 mg/kg in acute oral toxicity testing in Swiss albino mice. Computer-based predictions also show that the compound could have an interesting DMPK profile. The current study provides insight for further investigation of the antitumour potential of the molecule.
ARTICLE | doi:10.20944/preprints201706.0029.v1
Subject: Keywords: teaching and learning issues in mathematics; social issues in mathematics education; cultural issues in mathematics education; political issues in mathematics education; technological issues in mathematics education
Online: 5 June 2017 (06:13:28 CEST)
In this paper, we discuss major issues of mathematics teaching and learning in Nepal. Issues arising from theories such as social and radical constructivism include teachers not being trained to use such approaches in teaching mathematics and a lack of teaching aids, materials, and technological tools. The issues related to social aspects are gender issues, language issues, social justice issues, and issues related to the achievement gap. The cultural issues are related to the diversity of language and ethnicity. The issues related to political aspects are equity and access, economic status, pedagogical choice, and professional organizations and unions. The issues related to technology include technological skills, the use of technology, and affordance. Finally, we suggest that all stakeholders should pay attention to resolving these issues by improving the curriculum, training teachers, and resourcing the classroom with locally made and new technological tools.
ARTICLE | doi:10.20944/preprints201807.0628.v1
Subject: Physical Sciences, Applied Physics Keywords: Binding energy in lasers; Latent Binding energy in white light; Harnessing binding energy in sunlight
Online: 31 July 2018 (15:22:46 CEST)
The physics behind the collimated, highly directional nature of lasers, and the factors that keep the seven coloured waves forming white light together during their journey from the Sun to the Earth in the face of natural disruptive forces, are not fully understood. Energy levels were measured, in terms of alterations in induced current and voltage, in beams from a red laser, white LED light, and sunlight before and during their disruption by diffusers (frosted glass for the laser) and diffractors (a diffraction grating for the white light), using a photovoltaic solar cell panel attached to a digital multimeter. The results show that disruption of the beams releases extra energy, named here 'Latent Light Binding Energy'. It is hypothesized that this 'binding' energy keeps laser waves firmly bound together both end-on and side-on, enabling laser beams to travel long distances in a collimated manner. Likewise, the seven coloured waves that constitute white light are kept together, probably side-on, in their journey from the Sun to the Earth. The observation that diffraction of a sunbeam is associated with increased power generation provides a new lead to improve the harnessing of solar energy, where the focus is currently mainly on improving the efficiency of photovoltaic cells.
ARTICLE | doi:10.20944/preprints202212.0458.v2
Subject: Life Sciences, Microbiology Keywords: Leishmania major; Sesquiterpene; in silico; in vitro; Flow Cytometry
Online: 26 December 2022 (06:39:44 CET)
Leishmaniasis is a vector-borne parasitic infection transmitted by the bite of female Phlebotomine sandflies. The World Health Organization (WHO) estimates that 100,000 cases are reported annually on a global scale; moreover, 13 million people were infected between 2010 and 2015 worldwide. Treatment of leishmaniasis with conventional synthetic compounds faces challenges pertaining to adverse effects, which calls for the discovery of newer anti-leishmanial molecules. This study was performed to evaluate the effect and modes of action of a sesquiterpene alcohol, farnesol, on Leishmania major, the causative agent of zoonotic cutaneous leishmaniasis. The cytotoxic effect of farnesol against L. major promastigotes, amastigotes, and macrophages was assessed by the MTT test and by counting. The IC50 of farnesol on L. major promastigotes was evaluated by flow cytometry. In the findings, promastigotes were reduced at 167 µM/mL, and the mean numbers of L. major amastigotes in macrophages were significantly decreased on exposure to farnesol at 172 µM/mL. In addition, farnesol demonstrated no cytotoxicity on macrophages, as the IC50 value was 945 µM/mL; it induced significant dose-dependent apoptosis in L. major promastigotes. In silico protein-ligand binding analyses indicated that farnesol perturbs the ergosterol synthesis pathway of Leishmania, with attributes suggesting inhibition of lanosterol-α-demethylase, the terminal enzyme of the ergosterol synthesis machinery. Findings from flow cytometry reveal the role of farnesol in apoptosis-induced killing of promastigotes. Farnesol was effective at much lower concentrations than paromomycin. Further studies are crucial to evaluate the therapeutic potential of farnesol alone and in combination with other conventional drugs in clinical settings.
REVIEW | doi:10.20944/preprints202211.0161.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: High Performance Computing (HPC); big data; High Performance Data Analytics (HPDA); convergence; data locality; spark; Hadoop; design patterns; process mapping; in-situ data analysis
Online: 9 November 2022 (01:38:34 CET)
Big data has revolutionised science and technology, leading to the transformation of our societies. High Performance Computing (HPC) provides the necessary computational power for big data analysis using artificial intelligence methods. Traditionally, HPC and big data have focused on different problem domains and have grown into two different ecosystems. Efforts have been underway for the last few years to bring the best of both paradigms into HPC and big data converged architectures. Designing HPC and big data converged systems is a hard task, requiring careful placement of data, analytics, and other computational tasks such that the desired performance is achieved with the least amount of resources. Energy efficiency has become the biggest hurdle in the realisation of HPC, big data, and converged systems capable of delivering exascale and beyond performance. Data locality is a key parameter of High Performance Data Analytics (HPDA) system design, as moving even a byte costs heavily in both time and energy as the size of the system increases. Performance in terms of time and energy is the most important factor for users, particularly energy, due to it being the major hurdle in high performance system design and the increasing focus on green energy systems driven by environmental sustainability. Data locality is a broad term that encapsulates different aspects, including bringing computations to data, minimizing data movement by efficient exploitation of cache hierarchies, reducing intra- and inter-node communication, locality-aware process and thread mapping, and in-situ and in-transit data analysis. This paper provides an extensive review of the cutting edge on data locality in HPC, big data, and converged systems. We review the literature on data locality in HPC, big data, and converged environments and discuss challenges, opportunities, and future directions.
Subsequently, using the knowledge gained from this extensive review, we propose a system architecture for future HPC and big data converged systems. To the best of our knowledge, there is no such review on data locality in converged HPC and big data systems.
ARTICLE | doi:10.20944/preprints202210.0011.v1
Subject: Engineering, Biomedical & Chemical Engineering Keywords: Arm Prosthesis; Versatile Gripping; Design Process
Online: 3 October 2022 (12:59:58 CEST)
One of the biggest challenges in arm prosthesis design is to resemble normal hand functionality. Some arm prostheses try to mimic all hand movements, while others try to produce several main hand grip patterns. We designed a Versatile Gripping Technology (VGT) based on a basic whippletree mechanism to produce the main hand grip patterns in an arm prosthesis. The VGT development is the result of a process of abstraction, negation, and systemic variation of existing solutions. To validate the VGT, we tested its ability to produce six hand grip patterns from the Southampton Hand Assessment Procedure. The VGT is capable of producing lateral, power, tip, extension, and spherical grips. Moreover, the VGT includes the thumb movement while using only one moving part of simple shape; it is thus effective and efficient. With this result, we argue that the process of abstraction, negation, and systemic variation of existing solutions can create a novel solution and is applicable to other problems or domains.
ARTICLE | doi:10.20944/preprints202207.0173.v1
Subject: Engineering, Energy & Fuel Technology Keywords: Cement; Co-process; Waste; Incineration; Landfill
Online: 12 July 2022 (04:32:38 CEST)
Recently, as the amount of waste generated has been rapidly increasing, there have been difficulties disposing of waste in Korea. As a solution, treating waste in cement kilns has been suggested, but the environmental and economic effects have not been specifically studied. In this study, the effects of alternative resources, and of reducing the social costs (installation and operation) associated with waste treatment facilities, were analyzed. Through a co-processing method, a reduction of approximately 53 kg of CO2 can be realized during the production of one ton of cement, together with cost savings of about 3,815 million USD. Another effect is an extension of the lifespan of landfills by 7.55 years.
ARTICLE | doi:10.20944/preprints202206.0216.v1
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: direct nanoimprint; process design; deep learning
Online: 15 June 2022 (08:49:17 CEST)
A hybrid smart process and material design system for nanoimprinting is proposed, which combines a learning system based on experimental results with one based on numerical simulation results. Instead of carrying out extensive learning experiments for various conditions, simulation-based learning results are used where the outcomes can be theoretically predicted by numerical simulation. In other words, the data lacking in experimental learning are complemented by simulation-based learning results. Therefore, the prediction of nanoimprint results under various conditions, without experimental learning, can be realized even for unknown materials. In this study, material and process design for a low-temperature nanoimprint process using glycerol-containing polyvinyl alcohol is demonstrated. Experimental results under limited conditions are learned to investigate the optimum glycerol concentrations and process temperatures, while simulation-based learning is used to predict the dependence on press pressure and shape parameters. The prediction results for unknown glycerol concentrations agreed well with follow-up experiments.
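The complementation of sparse experimental data with simulation output described above can be sketched as follows. This is a toy illustration: the temperature grid, the stand-in "simulator", the depth values, and the interpolation step are assumptions of the sketch, not the authors' system:

```python
import numpy as np

# Hypothetical process window: press temperatures (degC) at which imprint depth
# (um) is to be predicted. Experiments exist at only a few temperatures; a toy
# simulator fills in the gaps so the learner sees a complete data set.
temperatures = np.arange(40, 101, 10)            # 40, 50, ..., 100
experimental = {60: 1.9, 80: 3.1}                # sparse measured depths (um)

def simulate_depth(temp_c):
    """Stand-in for a numerical simulation: depth grows linearly with temperature."""
    return 0.06 * temp_c - 1.7

# Merge: prefer measurements, fall back to simulation where none exist.
depths = np.array([experimental.get(int(t), simulate_depth(t)) for t in temperatures])

# Predict at an unseen condition by interpolating over the merged data set.
predicted_75 = float(np.interp(75, temperatures, depths))
```

In the paper's setting the "learner" is more elaborate (deep learning over experimental and simulated samples), but the merge step, simulation standing in where experiments are missing, is the core of the hybrid idea.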
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: fraud audit; process mining; visual analytics
Online: 2 March 2021 (09:19:01 CET)
Among the knowledge areas in which process mining has had an impact, the audit domain is particularly striking. Traditionally, audits seek evidence in a data sample that allows inferences to be made about a population. Mistakes are usually committed when generalizing the results, and anomalies therefore remain in unprocessed sets. There are, however, some efforts to address these limitations using process mining-based approaches for fraud detection. To the best of our knowledge, no fraud audit method exists that combines process mining techniques and visual analytics to identify relevant patterns. This paper presents a fraud audit approach based on the combination of process mining techniques and visual analytics. The main advantages are: (i) a method is included that guides the use of the visual capabilities of process mining to detect fraudulent data patterns during an audit; (ii) the approach can be generalized to any business domain; (iii) well-known process mining techniques are used (Dotted Chart, Trace Alignment, Fuzzy Miner, etc.). The techniques were selected by a group of experts and were extended to enable filtering for contextual analysis, to handle levels of process abstraction, and to facilitate implementation in the area of fraud audits. Based on the proposed approach, we developed a software solution that is currently being used in the financial sector as well as in the telecommunications and hospitality sectors. Finally, for demonstration purposes, we present a real hotel management use case in which we detected suspected fraudulent behaviors, thus validating the effectiveness of the approach.
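As a rough illustration of the dotted-chart style of analysis named above, the sketch below groups events by case and computes case durations, the quantity whose outliers a dotted chart makes visible to an auditor. The two-case event log and all names are invented for the sketch; nothing here comes from the paper's tooling:

```python
from datetime import datetime

# Invented event log: (case id, activity, timestamp).
log = [
    ("case1", "create-invoice", "2021-01-04 09:00"),
    ("case1", "approve",        "2021-01-04 09:05"),
    ("case1", "pay",            "2021-01-04 09:06"),
    ("case2", "create-invoice", "2021-01-04 10:00"),
    ("case2", "approve",        "2021-01-06 16:00"),
    ("case2", "pay",            "2021-01-06 16:01"),
]

def case_durations(log):
    """Span between each case's first and last event, in seconds."""
    bounds = {}
    for case, _activity, ts in log:
        t = datetime.strptime(ts, "%Y-%m-%d %H:%M")
        lo, hi = bounds.get(case, (t, t))
        bounds[case] = (min(lo, t), max(hi, t))
    return {case: (hi - lo).total_seconds() for case, (lo, hi) in bounds.items()}

durations = case_durations(log)
# case1 completes within minutes; case2's two-day approval gap is exactly the
# kind of dot-pattern outlier that contextual filtering would surface for review.
```

On a dotted chart, each event is a dot at (timestamp, case); the gap in case2's dots is what draws the auditor's eye before any statistics are computed.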
ARTICLE | doi:10.20944/preprints202102.0581.v1
Subject: Mathematics & Computer Science, Algebra & Number Theory Keywords: Optoklik; Robotic Process Automation; RPA; scripting
Online: 25 February 2021 (13:34:05 CET)
Robotic Process Automation (RPA) is a fast-growing field within Business Process Automation (BPA). RPA deals with software robots that use the human application interface to perform repetitive tasks such as starting programs, filling in forms, and clicking buttons. Some RPA software tools, such as Blue Prism, Rapise, and UiPath, have become popular in recent years, but they still present problems, starting with their high price and the difficulty of adapting them to real use cases. Optoklik, on the other hand, is a relatively small RPA tool developed to be highly adaptable to real cases while remaining easy to use. This paper presents the main functions of Optoklik and discusses its use in automating different actions that rely on the human application interface.
ARTICLE | doi:10.20944/preprints202007.0177.v1
Subject: Engineering, Biomedical & Chemical Engineering Keywords: Chitosan; Process safety; NuDIST; CAPE; Shrimp
Online: 9 July 2020 (07:49:29 CEST)
Nowadays, inherently safer designs are considered key priorities to prevent or mitigate serious incidents with devastating consequences. The need for process safety assessment during early design phases has motivated several contributions related to computer-aided assessment methodologies for measuring the inherent safety of chemical processes. In this work, the large-scale production of chitosan from shrimp wastes was evaluated from a process safety point of view using the numerical descriptive inherent safety technique (NuDIST). To this end, simulation of chitosan production was performed using Aspen Plus® to obtain extended mass and energy balances. All the chemicals involved in the process were assessed for the following safety parameters: explosivity (EXP), flammability (FL), and toxicity (TOX). The safety assessment of the process covered the parameters of temperature (T), pressure (P), and heat of reaction (HR). The maximum chemical safety score was estimated at 171.01, with ethanol as the main contributor to the explosivity and flammability parameters. The score associated with operating data was calculated at 209.30, with heat of reaction being the most influential parameter. The NuDIST score was estimated at 380.30. This NuDIST value revealed the low hazards associated with the handling of substances such as shrimp wastes, chitosan, and water, as well as the non-extreme temperature and pressure conditions. In general, the large-scale production of chitosan from shrimp shells was shown to be an inherently safe alternative for waste valorization.
ARTICLE | doi:10.20944/preprints201810.0373.v1
Subject: Social Sciences, Organizational Economics & Management Keywords: habitual entrepreneurship; entrepreneurial process; novice entrepreneurs
Online: 17 October 2018 (06:07:08 CEST)
Habitual entrepreneurs have become an important group of entrepreneurs who make a large contribution to the process of wealth creation. Previous studies have indicated that habitual entrepreneurship is a widespread phenomenon, which highlights the need to focus on habitual entrepreneurs to understand the dynamic aspect of entrepreneurship. The aim of this article is to present the results of preliminary research on the entrepreneurial behavior of novice and habitual entrepreneurs, in particular their motivation and its impact on the entrepreneurial process. The research was conducted on a sample of 373 small innovative enterprises in the fourth quarter of 2017. The share of habitual entrepreneurs in the examined sample is 32.44%, which is comparable with research carried out in other countries. The survey results reveal that habitual entrepreneurs have a greater tendency to create opportunities (introducing new products or services) than novice entrepreneurs. Habitual entrepreneurs also have a higher development rate (increase in turnover) and a larger share of the international market than novice entrepreneurs. The article also contains indications for further research into the phenomenon of multiple entrepreneurship.
ARTICLE | doi:10.20944/preprints201808.0177.v1
Subject: Earth Sciences, Environmental Sciences Keywords: morphological indices; urban climate; planning process
Online: 9 August 2018 (06:21:33 CEST)
The purpose of this article is to analyze urban form through the mapping of morphological indices, namely impervious surface fraction, building density, verticality, height/width ratio, roughness length, and porosity, to support urban planning in the city of João Pessoa, PB, in northeastern Brazil. The study identifies and calculates these significant indices for the city's urban space using a Geographic Information System (GIS) model. The spatial indices play notable roles in climate at different scales, supporting guidelines to maximize environmental quality, promote improvements to thermal comfort, and minimize the urban heat island in João Pessoa, while providing relevant data (considering microclimate aspects) to guide decisions related to the planning process.
ARTICLE | doi:10.20944/preprints201801.0056.v1
Subject: Materials Science, General Materials Science Keywords: intermediate heat treatment; boron; fabrication process
Online: 8 January 2018 (09:36:45 CET)
In this study, we evaluated the cold drawing workability of two kinds of modified 9Cr-2W steel containing different contents of boron and nitrogen, depending on the temperature and time of the normalizing and tempering treatments. Using ring compression tests at room temperature, the effect of the intermediate heat treatment condition on workability was investigated. It was found that the prior austenite grain size can be changed by the austenite transformation, and the grain size increases with increasing temperature during the normalizing heat treatment. Alloy B and Alloy N showed different patterns after the normalizing heat treatment: Alloy N had higher stress than Alloy B, and the reduction in Alloy N increased while the reduction in Alloy B decreased. Alloy B showed a larger number of initially formed cracks and a larger average crack length than Alloy N, and crack length and number increased proportionally in Alloy B as the stress increased. Alloy B had lower crack resistance than Alloy N due to boron segregation.
ARTICLE | doi:10.20944/preprints201707.0002.v1
Subject: Engineering, Biomedical & Chemical Engineering Keywords: Bioreactor; Taguchi; Prodigiosin; Serratia; Process Development
Online: 3 July 2017 (17:13:27 CEST)
One of the major steps toward the industrialization of a microbial product is to optimize the cultivation conditions in a large-scale bioreactor and successfully control microbial behavior within the large-scale production environment. Statistical Design of Experiments has been proven to optimize a vast number of microbial processes, achieving robustness and exploring possible interactions among the variables. In this research, a Taguchi Orthogonal Array was applied to optimize the cultivation conditions of a newly isolated prodigiosin-producing marine bacterial strain, Serratia AM8887, at the bioreactor level. A two-step fermentation process was applied, with productivity scaled up from the shake flask level to a bench-top bioreactor (5 L) and subsequently to an in-situ sterilization bioreactor system (20 L), leading to a yield of 7 g/L compared with 100 mg/L prior to optimization and confirming that the Taguchi experimental design is a reliable option for the optimization of biotechnological processes. The produced pigment was purified, and its chemical structure was revealed by means of spectrophotometric, Mass Spectrometry (MS), Fourier transform infrared (FT-IR), and proton nuclear magnetic resonance (1H-NMR) spectroscopy analyses. The biological activity of the pigment, including its antibacterial and antioxidant activity and its cytotoxicity to cancer cell lines, was explored. The pigment showed very characteristic features that could be helpful in the food, pharmaceutical, and/or textile industries.
ARTICLE | doi:10.20944/preprints201612.0148.v1
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: process improvement; ewe’s cheese; systemic design
Online: 30 December 2016 (07:34:54 CET)
The research reported in this paper was aimed at improving the overall efficiency of a PDO-certified artisanal cheese production process. As a foodstuff PDO-certified by the EU, the cheese is considered to have properties and qualities determined by the geographical environment in which it is made, with its production taking place in a specific and determined geographical location, in this case the Serra da Estrela region. The authors therefore mapped, from a systemic perspective, the processes involved in the manufacturing and distribution of certified Serra da Estrela cheese. Numerous methods were used throughout the process, such as systemic design analysis and techniques derived from ethnographic methods, which led to the collection of data in the field and immersed the researcher in genuine work situations. Critical points were identified and emphasized in the systemic map with the purpose of encouraging initiatives to address and overcome the gaps and inefficiencies detected. The systemic design analysis triggered the development of design work. Observations following an ethnographic approach identified ergonomic risks in cheese making during the process of cutting excess chips, risks that foster the emergence of musculoskeletal disorders at the wrist. A tool better fitted to the task at hand was developed, and a prototype of the new tool enabled feedback to be collected from use in the work context, in order to inform product development. The domains of agricultural production and microbiology, considering the specific microorganisms developed through the ripening process of the cheese, turned out to be of high importance for the issue under focus, contributing to a broader understanding of the ripening process and its risks while simultaneously improving the efficiency and success of cheese production.
Had it not been for the systemic analysis, which served as a link between the boundaries of distinct domains approached throughout the research, such as the risk of microorganism contamination, ergonomics, energy efficiency, legislation and regulation policies, transportation challenges, and economic viability, and which simultaneously created bridges between them, the various problems might not have been detected in the first place, as they are usually addressed in specialized disciplines, predetermined by the restrictions of each specific area of knowledge. This research was based on an analysis that sought to establish possible connections between various disciplines while constantly maintaining a holistic perspective, making new connections and observing the issues from a new angle, apart from the already established methodologies. All of this laid the seeds for the development of a plan to tackle the critical points identified by the systemic analysis reported in this paper.
REVIEW | doi:10.20944/preprints202207.0458.v1
Subject: Biology, Agricultural Sciences & Agronomy Keywords: probiotic bacteria; oxalate-degrading; variables; in vivo; in vitro; oxaluria
Online: 29 July 2022 (10:00:53 CEST)
Oxalate, a compound produced by many edible plants and as a terminal metabolite in the liver of mammals, is a toxin that plays a detrimental role in human health. Humans and other mammals do not produce the enzymatic machinery to degrade oxalate; however, numerous oxalate-degrading bacteria reside in the mammalian gut and thus provide an important function for their hosts. The current review focuses on the environmental factors that influence the efficacy of probiotic oxalate-degrading bacteria, relative to oxalate metabolism. We describe the mechanism of oxalate catabolism and its consumption by obligate and facultative anaerobic oxalate-degrading bacteria, in both in vitro and in vivo environments. We also explore the environmental variables that impact oxalate degradation. Studies on single oxalate-degrading species have not shown a strong impact on oxalate metabolism, especially under high-oxalate conditions such as the consumption of foods high in oxalate (such as coffee and chocolate for humans, or halogeton in animal feed). Considering the variables that enhance oxalate degradation could support the application of effective probiotics as a therapeutic tool in individuals with hyperoxaluria. This review indicates that probiotics can be considered a good source of naturally occurring oxalate-degrading agents in the human colon.
Subject: Life Sciences, Genetics Keywords: gene doping; gene therapy; in vivo transfection; in vivo imaging
Online: 3 June 2020 (05:46:32 CEST)
The World Anti-Doping Agency has prohibited gene doping in the context of progress in gene therapy. There is a risk that the artificial regulation of genes using plasmids could be applied for gene doping; however, no gold-standard method to detect this has been established. Here, we aimed to develop a method to detect multiple transgene fragments as proof of gene doping. First, gene delivery model mice mimicking gene doping were created by injecting firefly luciferase plasmid with polyethylenimine (PEI) into the abdominal cavity. The results confirmed successful establishment of the model, with sufficient luminescence upon in vivo imaging. Next, multiple transgene fragments in the model were detected in plasma cell-free (cf)DNA, blood-cell-fraction DNA, and stool DNA using a TaqMan qPCR assay, with the highest levels in plasma cfDNA. Using just a single drop of whole blood from the model, we also attempted long-term detection; multiple transgene fragments were detected for up to 11 days. These findings indicate that combining plasma cfDNA, or just one drop of whole blood, with the TaqMan qPCR assay is feasible for detecting plasmid-PEI-based gene doping. Our findings could accelerate the development of methods for detecting gene doping in humans.
REVIEW | doi:10.20944/preprints202210.0148.v1
Subject: Medicine & Pharmacology, Pharmacology & Toxicology Keywords: drug development; high throughput screening; in vivo/in silico screening; zebrafish
Online: 11 October 2022 (10:33:33 CEST)
Introduction: Virtual Screening (VS) techniques and in vivo screening in the zebrafish model are currently being used in tandem for faster and more efficient drug development. Areas covered: We review the different virtual screening techniques, the use of zebrafish as a vertebrate model for drug discovery, and the synergy that exists between them. Expert opinion: We highlight the advantages of combining virtual and zebrafish larvae screening for drug discovery. On the one hand, VS is a faster and cheaper tool than in vivo screening for finding active compounds and candidate therapies when processing large compound libraries. On the other hand, zebrafish larvae form a vertebrate model that allows in vivo screening of large numbers of compounds. Importantly, physiology and chemical responses are mostly conserved between zebrafish and mammals. The availability of transgenic and mutant zebrafish lines allows the analysis of a specific phenotype upon treatment, along with toxicity, off-target effects, side effects, and dosage. The advantages of VS, of the in vivo whole-animal screening approach, and of their combination are also reviewed.
REVIEW | doi:10.20944/preprints202009.0756.v2
Subject: Medicine & Pharmacology, Allergology Keywords: 3D in vitro models; eye research; in silico analysis; eye anatomy
Online: 26 October 2020 (11:45:00 CET)
The human eye is a specialized organ with complex anatomy and physiology, comprising different cell types with specific physiological functions; given this complexity, ocular tissues are finely organized and orchestrated. In the last few years, many in vitro models have been developed for eye toxicity testing in order to meet the 3Rs principle (Replacement, Reduction and Refinement). Such testing is necessary to ensure that the risks associated with ophthalmic products meet appropriate safety criteria. In vitro preclinical testing is now a well-established practice of significant importance for evaluating the efficacy and safety of cosmetic, pharmaceutical, and nutraceutical products. Alongside in vitro testing, computational procedures for evaluating the pharmacological profile of potential ocular drug candidates, including their toxicity, are in rapid expansion and are described herein. This review describes the ocular cell types and their functions, providing an overview of the scientific challenges in the development of three-dimensional in vitro models.
ARTICLE | doi:10.20944/preprints202210.0186.v2
Subject: Life Sciences, Biophysics Keywords: neural; brain; structure; function; process; cell expression
Online: 18 October 2022 (02:37:36 CEST)
This paper describes some neural representations that may be helpful for realising intelligence in the human brain. The ideas come from the author's own cognitive model, in which a number of algorithms have been developed over time. Through developing and implementing the architecture, ideas like separating the data from the function have proven architecturally appropriate, and there have been several opportunities to make the system more orthogonal. Similarly, in the human brain, neural structures may work in sync with the neural functions, or may be slightly separate from them. Each section discusses one of the neural assemblies with a potential functional result, covering ideas such as timing or scheduling, inherent intelligence, and neural binding. A further aspect, self-representation or expression, is also of interest and may help the brain to realise higher-level functionality based on these lower-level processes.
ARTICLE | doi:10.20944/preprints202204.0138.v1
Subject: Mathematics & Computer Science, Other Keywords: API; clickstream; cloud applications; process mining; scripting
Online: 15 April 2022 (07:37:06 CEST)
Background: Process mining (PM) exploits event logs to obtain meaningful information about the processes that produced them. As the number of applications developed on cloud infrastructures increases, it becomes important to study and discover their underlying processes. However, many current PM technologies face challenges in dealing with complex and large event logs from cloud applications, especially when the logs have little structure (e.g., clickstreams). Methods: Using Design Science Research, this paper introduces a new method, called Cloud Pattern API – Process Mining (CPA-PM), that enables discovering and analyzing cloud-based application processes with PM in a way that addresses many of these challenges. CPA-PM exploits a new application programming interface (API), with an R implementation, for creating repeatable scripts that preprocess event logs collected from such applications. Results: Applying CPA-PM to a case with real and evolving event logs related to the trial process of a Software-as-a-Service cloud application led to useful analyses and insights, with reusable scripts. Conclusion: CPA-PM helps produce executable scripts for filtering event logs from clickstream and cloud-based applications, where the scripts can be used in pipelines while minimizing the need for error-prone and time-consuming manual filtering.
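The kind of repeatable event-log preprocessing that CPA-PM scripts perform can be sketched in a few lines. The following is a minimal Python/pandas analogue (the paper's actual API is implemented in R; the column names, function name, and filtering rule here are illustrative assumptions, not the CPA-PM API):

```python
# Hypothetical sketch of clickstream event-log preprocessing in the spirit
# of CPA-PM scripts; not the paper's R API.
import pandas as pd

def preprocess_log(log: pd.DataFrame, min_activity_freq: int = 2) -> pd.DataFrame:
    """Sort events within each case and drop rare activities (clickstream noise)."""
    log = log.sort_values(["case_id", "timestamp"]).reset_index(drop=True)
    freq = log["activity"].value_counts()
    keep = freq[freq >= min_activity_freq].index
    return log[log["activity"].isin(keep)].reset_index(drop=True)

events = pd.DataFrame({
    "case_id":   ["c1", "c1", "c1", "c2", "c2", "c2"],
    "activity":  ["signup", "click_ad", "trial", "signup", "trial", "purchase"],
    "timestamp": pd.to_datetime([
        "2022-01-01 09:00", "2022-01-01 09:05", "2022-01-01 09:10",
        "2022-01-02 10:00", "2022-01-02 10:20", "2022-01-02 10:40",
    ]),
})

clean = preprocess_log(events)  # "click_ad" and "purchase" occur once, so are dropped
```

Encoding such steps as a script, rather than filtering by hand, is what makes the preprocessing repeatable inside a pipeline.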
ARTICLE | doi:10.20944/preprints202110.0259.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: Blockchain technology; Process authenticity; Tokens; Anchors; Oracles
Online: 18 October 2021 (15:54:39 CEST)
Over the last four years, the evolution and adoption of blockchain and, more generally, distributed ledger systems have seen the consolidation of many concepts and models with significant differences in system governance and suitable applications. This work updates the critical analysis of blockchain technologies carried out in our previous contribution to this journal, extending the focus to distributed ledger components and systems. Starting from the topical concept of decentralization, we introduce concepts and building blocks currently adopted in the available systems, centering on their functional aspects and impact on possible applications. We present some conceptual framing tools helpful in the application context, and we propose the concept of process authenticity, which we discuss through two use cases: blockchain document dematerialization and e-voting.
ARTICLE | doi:10.20944/preprints202109.0006.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: semantics; process cycle; subjectivity; quantum cognition; qubit
Online: 1 September 2021 (11:24:31 CEST)
The paper describes a model of subjective, goal-oriented semantics extending the standard "view-from-nowhere" approach. The generalization is achieved by using a spherical vector structure that supplements the classical bit with a circular dimension, organizing contexts according to their subjective causal ordering. This structure, known in quantum theory as the qubit, is shown to be a universal representation of contextual, situated meaning at the core of human cognition. The subjective semantic dimension, inferred from fundamental oscillation dynamics, is discretized into six process-stage prototypes expressed in common language. The predicted process-semantic map of natural-language terms is confirmed by open-source word2vec data.
ARTICLE | doi:10.20944/preprints202105.0464.v1
Subject: Social Sciences, Accounting Keywords: Remittances; democracy; election process; Bangladesh; labour migrants
Online: 20 May 2021 (09:41:25 CEST)
This paper examines how remittances contribute to the democratisation process in Bangladesh. The endogeneity between remittances and democracy is tackled by employing the Structural VAR (SVAR) approach. We find that while remittances respond to innovations in the macro-political variables, remittances also have an important impact on these variables. Our results reconcile two opposing findings in the politics literature: that remittance flows stabilise autocracies, and that they foster the prospects for democratisation. In particular, we demonstrate that a shock in remittance flows has a negative but transitory impact on democracy: an initial bout of autocratic episodes is eventually eliminated, and democracy is restored to its original level within three to five years. However, using an alternative measure of democracy constructed with principal-component analysis, we find that after the fifth year following a shock in remittance flows, a small but positive permanent effect on democracy is observable that does not revert to zero by the end of the ten-period horizon.
ARTICLE | doi:10.20944/preprints202103.0308.v1
Subject: Engineering, Automotive Engineering Keywords: discrete-impulse energy; hydromechanic; process; milk products
Online: 11 March 2021 (10:52:37 CET)
The basis of the discrete-impulse energy supply (DIES) concept is the efficient use of supplied energy. The literature describes the general principles of DIES in detail, examining the energy and thermodynamic aspects and the main intensification mechanisms that can be initiated on the basis of this principle. DIES mechanisms can conveniently be divided into hard and soft ones: the former are used to stimulate hydromechanical processes, while the latter accelerate phase heat and mass transfer or serve the intensive mixing of multicomponent media. The authors studied the possibility of using DIES to intensify hydromechanical processes, in particular the emulsification of milk fat (homogenization of milk, preparation of spreads) and the processing of cream cheese masses. The objects of research were whole non-homogenized milk, fat emulsions, and cream cheese mass. The efficiency of milk homogenization was evaluated via the change in the homogenization coefficient, determined by the centrifugation method as the most affordable and accurate one. Emulsions were evaluated according to the degree of destabilization, resistance, and dispersion of the fat phase. The rheological characteristics of the cheese masses were evaluated via the change in effective viscosity.
ARTICLE | doi:10.20944/preprints202011.0412.v1
Subject: Biology, Anatomy & Morphology Keywords: Process; ontological category; life concept; essential feature
Online: 16 November 2020 (10:49:11 CET)
Although knowledge about biological systems has advanced exponentially in recent decades, it is surprising that the very definition of life still presents theoretical challenges. Even though several lines of reasoning seek to identify the essence of the life phenomenon, most contain a fundamental problem in their basic conceptual structure: they fail to identify features that are both necessary and sufficient to define life. Here, we analyze the main conceptual frameworks supporting life concepts: (i) the physical, (ii) the cellular, and (iii) the molecular approaches. Based on ontological analysis, we propose that life should not be positioned under the ontological category of Matter; rather, it is better understood under the top-level ontology of “Process”. Taking an epistemological approach, we propose that the essential characteristic pervading each and every living being is the presence of organic codes. We therefore explore theories in biosemiotics in order to propose a clear concept of life as a macrocode composed of multiple inter-related coding layers. This suggests a clear distinction between the concept of life and living beings, a distinction that is not evident in theoretical terms. From the proposed concept, we suggest that the evolutionary process is a fundamental characteristic for life’s maintenance but not for its definition. The current proposition opens a fertile field of debate in astrobiology, biosemiotics, and robotics.
ARTICLE | doi:10.20944/preprints202003.0355.v1
Subject: Engineering, Control & Systems Engineering Keywords: Thermo-Fluidic Process; Inkjet Printing; Feedback Control
Online: 24 March 2020 (08:30:10 CET)
This paper introduces a closed-loop control strategy for maintaining consistency of ink temperature in commercial Drop-on-Demand (DoD) inkjet printing. No additional sensors or actuators are installed in the printhead while achieving this consistency. To this end, the paper presents a novel in situ sensing-actuation policy at every individual ink nozzle, in which the jetting mechanism plays three distinct roles: it jets liquid droplets onto the print media according to the print job; it acts as a soft sensor to estimate the real-time liquid temperature at the jetting nozzle; and, while not jetting, it acts as a heating actuator to minimize the gradient of liquid temperature among jetting nozzles. The soft-sensing-based in situ controller is implemented in an experimentally validated digital twin that models the thermo-fluidic processes of the printhead. The digital twin is scalable and flexible enough to incorporate an arbitrary number of ink nozzles, making the control strategy applicable to future printhead designs.
ARTICLE | doi:10.20944/preprints201907.0348.v1
Subject: Engineering, Other Keywords: rotorcraft; machine learning; Gaussian process; flight simulation
Online: 31 July 2019 (04:55:48 CEST)
Physics-based models are widely utilized in the aerospace industry; one such use is to provide flight dynamics models for flight simulators. For human-in-the-loop use, such simulators must run in real time. Due to the complex physics of rotorcraft flight, meeting this real-time requirement sometimes forces simplifications of the underlying physics, leading to errors in the model's predicted responses compared to the real vehicle. This study investigated whether a machine-learning technique could provide rotorcraft dynamic response predictions, with the ultimate aim of this model taking over when the physics-based model's accuracy degrades. Machine learning was facilitated using a Gaussian Process (GP) non-linear autoregressive model, which predicted the on-axis pitch rate, roll rate, yaw rate, and heave responses of a Bo105 rotorcraft. A variational sparse GP model was then developed to reduce the computational cost of applying the approach to large data sets. Both GP models provided accurate on-axis response predictions, particularly when the input contained all four control inceptors and one lagged on-axis response term, and the predictions improved on those of a corresponding physics-based model. Reducing the training data to one-third (rotational axes) or one-half (heave axis) resulted in only minor degradation of the GP model predictions.
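The GP autoregressive idea, predicting the next on-axis response from control inputs plus a lagged response term, can be illustrated on a toy single-axis system. The sketch below uses scikit-learn's GP regressor on synthetic first-order dynamics; the data, kernel choice, and train/test split are our assumptions, not the study's Bo105 setup:

```python
# Toy single-axis GP-NARX sketch (illustrative only; not the paper's model or data).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Synthetic dynamics standing in for one rotorcraft response axis:
# y[k+1] = 0.9*y[k] + 0.5*u[k] + noise
N = 200
u = rng.uniform(-1, 1, N)
y = np.zeros(N)
for k in range(N - 1):
    y[k + 1] = 0.9 * y[k] + 0.5 * u[k] + 0.01 * rng.standard_normal()

# NARX regressors: the control input and one lagged on-axis response term
X = np.column_stack([u[:-1], y[:-1]])
t = y[1:]

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X[:150], t[:150])

pred, std = gp.predict(X[150:], return_std=True)   # mean and uncertainty
rmse = float(np.sqrt(np.mean((pred - t[150:]) ** 2)))
```

A useful property of the GP approach, relevant to the "taking over" aim, is that `std` quantifies the model's own confidence in each prediction.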
Subject: Earth Sciences, Environmental Sciences Keywords: peatland; Central Spain; anthropogenic process; genesis; evolution
Online: 26 March 2019 (10:17:09 CET)
This paper constitutes a first advance in the paleoenvironmental study of a small group of peatland ecosystems of reduced size located in the interior of the Iberian Peninsula (Puebla de Don Rodrigo, Ciudad Real, Spain). It is a singular enclave: these ecosystems are home to the southernmost peatlands in Europe, are located at the lowest altitude in Spain, and are unique to the region of Castilla-La Mancha. They form ecosystems similar to the peat bogs of northern Europe, but in a Mediterranean climate. The analysis combines sample collection with data gathered from documentary, textual, and cartographic sources produced from the 16th century to the present day. The scientific analyses performed and the documentation consulted support the hypothesis that these peatlands are the result of a long process of historical modification of the landscape, in which anthropogenic activity has played a leading role by triggering a series of hillslope processes that are culminating in the exhumation of the waterlogged areas, thereby establishing a recent genesis for these ecosystems.
ARTICLE | doi:10.20944/preprints201902.0202.v1
Subject: Earth Sciences, Other Keywords: CaCO3 polymorphs; sulphate; ageing process; aragonite; gypsum
Online: 21 February 2019 (10:45:12 CET)
In this work we experimentally study the nucleation and growth of CaCO3 phases precipitated from supersaturated aqueous solutions in the presence of varying concentrations of the sulphate oxyanion. The experiments were conducted at near-neutral pH (7.6) over a wide range of initial (SO42-)/(CO32-) ratios (0 to ~68) in the aqueous solution. We paid special attention to the evolution of the precipitates during ageing over a time frame of 14 days. The mineralogy, morphology, and composition of the precipitates were studied by X-ray diffraction, Fourier transform infrared spectroscopy, scanning electron microscopy, and EDX microanalysis. The concentration of sulphate ions in the reacted aqueous solution was measured by ICP. The experimental results show that the mineral composition of the precipitate recovered in each run varied with the (SO42-)/(CO32-) ratio in the parent solution, which influences the mineral evolution of the precipitates during ageing. We observe that high concentrations of sulphate in the aqueous solution stabilize the vaterite precipitates and inhibit calcite formation. Furthermore, aragonite never precipitates directly from the solution; it forms only via a dissolution-precipitation process in solutions with a high (SO42-)/(CO32-) ratio after long reaction times. Finally, gypsum precipitates only after long ageing in the aqueous solutions with the highest sulphate concentrations. The reaction pathways during ageing, the morphology of the calcite crystals, and the composition of vaterite and calcite are discussed considering both kinetic and thermodynamic factors. These results reveal considerably more complex behavior of the system than that observed in experiments conducted at higher pH and supersaturation levels and lower (SO42-)/(CO32-) ratios in the aqueous phase.
ARTICLE | doi:10.20944/preprints201812.0169.v1
Subject: Life Sciences, Biochemistry Keywords: fungicides; dissipation; winemaking process; anthocyanins; antioxidant activity.
Online: 13 December 2018 (14:42:28 CET)
The effect of fungicides on fermentation is of paramount importance for controlling the quality and safety of wines. In this work, the quality (oenological parameters, color, phenolic content, antioxidant activity, and fungicide residues) of wines from Monastrell grapes fortified with the fungicides iprovalicarb, mepanipyrim, and tetraconazole was evaluated. Over the course of the winemaking process, initial residues of mepanipyrim and tetraconazole were removed by more than 90%, while dissipation of iprovalicarb was around 73%. Statistically significant differences were found in the presence of iprovalicarb and mepanipyrim residues, especially at the highest concentration assayed. For both fungicides, increases were observed in the volatile acidity (4 to 8.6 times), the lactic acid content (8.6 to 20.5 times), and the percentage of polymeric anthocyanins (1.3 to 1.7 times), along with a slight increase in the total phenolic index and the total anthocyanin content determined by spectrophotometry. On the contrary, the total monomeric anthocyanin content decreased by about 16.3% and 28.6% in the presence of iprovalicarb and mepanipyrim, respectively. These results could be related to the addition of SO2 to the grape must and a greater development of acetic or lactic acid bacteria in the presence of these fungicides. The color of the final wines also differed from the control, with a higher yellow component, color intensity, tonality, and hue angle, because of pH changes in the medium. Tetraconazole fermentations followed a trend more similar to the control wine, probably due to the lower concentration of this fungicide in the grape must at the initial time. No effect on antioxidant activity was observed for any of the target fungicides. A multivariate statistical analysis was performed to examine the interrelationships between variables (color and anthocyanin profile); the obtained model allowed wines to be separated according to the fungicide treatment applied.
ARTICLE | doi:10.20944/preprints201810.0672.v1
Subject: Engineering, Control & Systems Engineering Keywords: linear model predictive control; process control; stability
Online: 29 October 2018 (10:57:05 CET)
The goal of this contribution is an application of Linear General Model Predictive Control (LGMPC). In this paper, stability of the LGMPC is proven by demonstrating a theorem that states a sufficient and constructive condition. This condition can be applied to calculate the weight matrices of the cost function in the LGMPC optimisation problem. Lower-bound conditions are found for one of these matrices, and a system with saturation is then taken into consideration. The conditions can be interpreted and discussed in physical terms. The obtained results were tested by means of computer simulations, and an example with a water recovery process is considered.
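To see the role the cost-function weight matrices play, a minimal unconstrained linear MPC can be written in closed form. The sketch below is a generic receding-horizon example in NumPy (the plant, horizon, and weights are our illustrative choices; this is not the paper's LGMPC formulation or its stability condition):

```python
# Minimal unconstrained linear MPC: quadratic cost with state weight Q and
# input weight R over horizon H, solved in closed form at each step.
# Generic illustration only; not the LGMPC theorem from the paper.
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 1.0]])   # double-integrator-like plant
B = np.array([[0.0], [0.1]])
Q = np.diag([10.0, 1.0])                  # state weights in the cost
R = np.array([[0.1]])                     # input weight in the cost
H = 20                                    # prediction horizon

def mpc_step(x):
    n, m = A.shape[0], B.shape[1]
    # Stacked prediction: X = F x + G U over the horizon
    F = np.vstack([np.linalg.matrix_power(A, i + 1) for i in range(H)])
    G = np.zeros((H * n, H * m))
    for i in range(H):
        for j in range(i + 1):
            G[i*n:(i+1)*n, j*m:(j+1)*m] = np.linalg.matrix_power(A, i - j) @ B
    Qbar = np.kron(np.eye(H), Q)
    Rbar = np.kron(np.eye(H), R)
    # Minimise U'(G'QG + R)U + 2 x'F'QG U  ->  one linear solve
    Ustar = np.linalg.solve(G.T @ Qbar @ G + Rbar, -G.T @ Qbar @ F @ x)
    return Ustar[:m]                      # receding horizon: apply first input only

x = np.array([1.0, 0.0])
for _ in range(100):
    x = A @ x + (B @ mpc_step(x)).ravel()
```

Increasing Q relative to R penalises state deviation more heavily and speeds up regulation; the sufficient conditions the paper derives constrain exactly this choice so that closed-loop stability is guaranteed.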
ARTICLE | doi:10.20944/preprints201808.0541.v1
Subject: Engineering, Other Keywords: anaerobic process; biogas; coffee wastewater; digester construction
Online: 31 August 2018 (05:47:42 CEST)
Wet coffee processing produces wastewater containing organic matter, and this high organic content can be utilized to produce biogas through the anaerobic process. Anaerobic digester construction can affect both the removal of wastewater pollutants and the quantity of biogas. The purpose of this study is to determine the effect of digester construction, comparing a conventional digester, a CSTR, and a UASB digester, on biogas production from coffee wastewater. The conventional digester operated without a temperature control system as the control, while the UASB and CSTR digesters operated with temperature control. The biomass volume was about 5 L, with a 35-day incubation time. Temperature and pH for the UASB and CSTR were set within the ranges 30-35 °C and pH 6.0-8.0. Across the feeding variations, the UASB showed stable performance, with an average biogas production of 83.57 mL/day. It also achieved the highest remediation efficiencies for COD, BOD, and C/N: 85.00 ± 0.34%, 84.40 ± 5.66%, and 97.78 ± 0.57%, respectively.
ARTICLE | doi:10.20944/preprints202112.0234.v1
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: submerged arc; heat resistant steel; square waveform welding; aggregate quality index; bay area; melting efficiency; process; model; process map
Online: 14 December 2021 (12:46:10 CET)
The demand for efficient processes through a comprehensive understanding and optimization of welding conditions continues to grow in the manufacturing industry. This study involves V-groove welding of heat-resistant 2.25Cr-1Mo steel using square-waveform alternating current. Experiments were conducted to establish the relationship between input variables, such as current, frequency, electrode negativity ratio, and welding speed, and process performance measures, such as penetration, bay area, deposition rate, melting efficiency, percentage dilution, flux-wire ratio, and heat input. The process was analyzed in light of a defect-free, high-deposition groove weld, sensitivity to process parameters, and the optimization and development of a process map. The study proposes an innovative approach to reducing the cost and time of optimizing the one-pass-each-layer V-groove welding process using bead-on-plate welds. Square-waveform welding creates a metallurgical notch in the form of a bay at the fusion boundary that can be minimized by selecting appropriate welding conditions. Square-waveform submerged arc welding is more sensitive to changes in current and welding speed than to frequency and electrode negativity ratio; however, the electrode negativity ratio and frequency are minor but helpful parameters for achieving optimal results. Agreement between the planned and experimental results to within 3% confirms the validity of the proposed approach. The investigation shows that 90% of the maximum deposition rate is achievable for one-pass-each-layer V-groove welds within heat input and weld width constraints.
ARTICLE | doi:10.20944/preprints202202.0098.v1
Subject: Social Sciences, Other Keywords: Paleoclimatic variations in Southern Peru; Early human occupations in Tacna - Peru; Ethnoarchaeological analysis of the Populations in the Western Andes; Development of the Puquina; Culture in the Collisuyu territory
Online: 7 February 2022 (16:11:17 CET)
Environmental interactions contributed to the settlement and development processes of the first settlers in southern Peru around 15,000 BC. This process is characterized on the basis of paleoclimatic studies of southern Peru and ethnoarchaeological evidence. From this analysis we establish the hypothesis of environmental interaction and territorial occupation by the first hunter-gatherer populations up to their consolidation, dominating the geographical space from Tacna. Disruptive disaster events and their effects on climate during the Peruvian Paleolithic gave rise to the local societies that shaped the Tiawanaku societies of the early horizon in southern Peru and northern Chile.
ARTICLE | doi:10.20944/preprints202108.0102.v2
Subject: Chemistry, Analytical Chemistry Keywords: Oxygen generator; Carbon dioxide; Sabatier process; Mars; Moon
Online: 9 January 2023 (08:16:30 CET)
In the near future, the next generation of space travel will emerge, and people will live on multiple planets. Because having enough oxygen for human survival is mandatory, in this research article we seek an appropriate method for producing oxygen on another planet, considering the accessible resources and their limitations.
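The mass balance behind one commonly proposed oxygen loop can be checked with back-of-the-envelope stoichiometry. The sketch below combines the standard Sabatier reaction with water electrolysis (this is our illustration of the well-known chemistry, not a result or design from the article):

```python
# Back-of-the-envelope stoichiometry for a Sabatier-plus-electrolysis loop:
#   Sabatier:     CO2 + 4 H2 -> CH4 + 2 H2O
#   Electrolysis: 2 H2O      -> 2 H2 + O2
co2_in = 10.0                  # mol of CO2 fed to the Sabatier reactor

h2o_out = 2.0 * co2_in         # mol of water produced by the Sabatier reaction
o2_out = h2o_out / 2.0         # mol of O2 from electrolysing that water
h2_recycled = h2o_out          # mol of H2 recovered by electrolysis
h2_makeup = 4.0 * co2_in - h2_recycled   # extra H2 that must be supplied

# Net result: each mol of CO2 yields 1 mol of O2 but consumes 2 mol of
# make-up H2, which is exactly why hydrogen supply limits such schemes.
```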
ARTICLE | doi:10.20944/preprints202208.0006.v1
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: Management; Improvement; System; Production process; Technologies; management methods
Online: 1 August 2022 (05:22:28 CEST)
Managers are often faced with the task of improving the management of the production process in order to maintain the sustainability of production efficiency in a highly competitive environment. The submitted contribution deals with the design of a system that will support them in the selection of progressive methods and technologies in order to improve the management of production processes. Managers often follow new trends in this area, but it is not easy for them to access the knowledge of which proven progressive technologies can bring higher efficiency. The presented system is based on using knowledge of existing solutions from manufacturers of automotive components, where companies solve similar problems in production management. The mix of methods and technologies used in the management of production processes brings companies different results in the efficiency achieved. The proposed system for supporting the selection of production management methods and technologies is built from blocks for data collection, benchmarking of the performance of production processes of industrial enterprises, and data mining technology to obtain knowledge about the effect on efficiency of already implemented technologies. The last blocks help to examine the differences in the implementation of the same methods and technologies and allow the obtained results to be presented to the manager in the form of recommendations for choosing a suitable progressive method and technology.
ARTICLE | doi:10.20944/preprints202108.0248.v1
Subject: Mathematics & Computer Science, Probability And Statistics Keywords: Random fields; warped Gaussian Process; Spatial field reconstruction
Online: 11 August 2021 (10:39:35 CEST)
A class of models for non-Gaussian spatial random fields is explored for spatial field reconstruction in environmental and sensor network monitoring. The family of models explored utilises a class of transformation functions known as the Tukey g-and-h transformations to create a family of warped spatial Gaussian process models that support desirable features such as flexible marginal distributions, which can be skewed, leptokurtic, and/or heavy-tailed. The resulting model is widely applicable in a range of spatial field reconstruction applications. To use the model in practice, it is important to carefully characterise the statistical properties of the Tukey g-and-h random fields. In this work, we both study the properties of the resulting warped Gaussian processes and use those characterising statistical properties to obtain flexible spatial field reconstructions. In this regard, we derive five different estimators for important quantities often considered in spatial field reconstruction problems: the multi-point Minimum Mean Squared Error (MMSE) estimators; the multi-point Maximum A-Posteriori (MAP) estimators; an efficient class of multi-point linear estimators based on the Spatial Best Linear Unbiased (S-BLUE) estimators; and two multi-point threshold-exceedance-based estimators, namely the Spatial Regional and Level Exceedance estimators. Simulation results and a real data application in environmental monitoring show the benefits of using the Tukey g-and-h transformation over standard Gaussian spatial random fields.
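The warping itself is a simple pointwise transformation of a latent Gaussian value. A minimal sketch of the standard Tukey g-and-h transform, applied to Gaussian draws to produce a skewed, heavy-tailed marginal (the parameter values below are illustrative, not those of the paper):

```python
# Tukey g-and-h transform applied to a latent Gaussian: g controls skewness,
# h >= 0 controls tail weight. Illustrative parameter values.
import numpy as np

def tukey_gh(z, g=0.5, h=0.1):
    """T(z) = ((exp(g z) - 1) / g) * exp(h z^2 / 2); limit z*exp(h z^2/2) as g -> 0."""
    base = z if g == 0.0 else (np.exp(g * z) - 1.0) / g
    return base * np.exp(h * z**2 / 2.0)

rng = np.random.default_rng(1)
z = rng.standard_normal(100_000)      # latent Gaussian field values
x = tukey_gh(z, g=0.5, h=0.1)         # warped: right-skewed, heavy-tailed

sample_skew = float(np.mean((x - x.mean())**3) / x.std()**3)
```

Because the transform is strictly increasing (for h >= 0), quantiles of the warped field are simply the transform of the Gaussian quantiles, which is what makes the MMSE, MAP, and exceedance estimators in the paper tractable.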
REVIEW | doi:10.20944/preprints202006.0110.v1
Online: 7 June 2020 (16:42:16 CEST)
Perovskite-based solar cells have achieved tremendous progress, reaching record efficiencies in the past five years. Numerous new processes and chemistries have been reported, contributing to this rapid progress. Continuous efficiency improvements are still necessary for perovskite solar cells, and exploratory data analysis of device performance over multiple studies could boost the technology's development: such analysis can identify patterns or provide insights that are not obvious in a single study. Here we present a high-quality dataset containing only independently certified Pb-based perovskite solar cells, summarizing their efficiency, relevant I-V metrics, manufacturing processes, and materials used. Analysis of the dataset provides insights into how aperture size, perovskite deposition methods, and the materials used in each functional layer affect the final solar cell efficiency and I-V metrics. Future directions for efficiency improvements are also suggested.
ARTICLE | doi:10.20944/preprints202003.0396.v2
Subject: Engineering, Mechanical Engineering Keywords: production process design; design for manufacturability; fuzzy logic
Online: 27 March 2020 (13:01:50 CET)
The paper presents a design methodology for the production process of a new product from the point of view of the assembly operations technology criterion (Design for Assembly, DFA) under high-volume production conditions. DFA methods and techniques used in the implementation of a new product are discussed. The author presents a new method to assess design for manufacturability based on fuzzy variables. An example is given to illustrate the proposed course of action.
ARTICLE | doi:10.20944/preprints202002.0008.v1
Subject: Engineering, General Engineering Keywords: ceramic membrane; combination process; microfiltration; optimization; recovery efficiency
Online: 3 February 2020 (03:58:04 CET)
The aim of this study is to evaluate the optimal conditions for a membrane filtration process. Both laboratory-scale and pilot-scale tests were conducted to examine treated water produced from blended water. The water samples were prepared by blending raw water with effluent water filtered through an organic membrane. At lab scale, the optimal treatment efficiency was obtained at a flux of 2.0 m3/m2∙day, a blending ratio of 4:1, and an optimal coagulant dosage of 20 ppm. The pilot-scale test showed that the optimal efficiency was obtained at a flux of 2.0 m3/m2∙day and a blending ratio of 6.0:1. However, the differing lab-scale and pilot-scale results for the optimal coagulant dosage implied that it is difficult to achieve stable process operation at low coagulant levels. In summary, in the combination process of organic and ceramic membranes, the recovery efficiency reached above 98.4%, compared to 92.1% for a single organic membrane process, making the combination process 6.3 percentage points more efficient. This combination process leads to stable recovery rates through optimal dosage input, a lower pollution load to the water, and a stabilized filtration system.
ARTICLE | doi:10.20944/preprints202001.0065.v1
Subject: Mathematics & Computer Science, Probability And Statistics Keywords: simulation; binomial distribution; Poisson distribution; stochastic process; modelling
Online: 8 January 2020 (07:43:23 CET)
This study developed a Matlab application for simulating statistical models project (SMp) probabilistic distributions similar to the binomial and Poisson distributions, which were created by mathematical procedures. The simulated distributions are graphically compared with these popular distributions. The application can produce many probabilistic distributions and shows the trend (τ) for n trials with success probability p, i.e. the maximum probability at τ = np. While the Poisson distribution PD(x;µ) is a unique probabilistic distribution, with PD = 0 at x = +∞, the application simulates many SMp(x;µ,Xmax) distributions, where µ is the Poisson parameter and generally the value of x with the maximum probability, and Xmax is the upper limit of x with SMp(x;µ,Xmax) ≥ 0 and the limit of the stochastic region of a random discrete variable. It is shown that, via simulation, one can obtain many probabilistic distributions, and better ones than by mathematical procedures alone.
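The trend τ = np described above can be illustrated without the authors' Matlab application: simulating a binomial distribution by direct Bernoulli trials, its peak sits near τ, and for small p and large n it approaches a Poisson distribution with µ = np. A minimal sketch (parameters are illustrative, not from the paper):

```python
# Hedged sketch (not the authors' Matlab code): simulate a binomial
# distribution and check that its sample mean and mode track the trend
# tau = n * p, the location of the maximum probability.
import random
from collections import Counter

random.seed(42)                      # reproducible illustration
n, p, trials = 1000, 0.005, 5000     # n Bernoulli trials, success probability p
samples = [sum(random.random() < p for _ in range(n)) for _ in range(trials)]

tau = n * p                          # trend: expected peak location (here 5.0)
mean = sum(samples) / trials
mode = Counter(samples).most_common(1)[0][0]

print(f"tau = {tau}, sample mean = {mean:.2f}, sample mode = {mode}")
```

With n large and p small, the same histogram is well approximated by a Poisson distribution with µ = np, which is the regime the SMp distributions generalize.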
ARTICLE | doi:10.20944/preprints202001.0014.v1
Subject: Mathematics & Computer Science, Probability And Statistics Keywords: simulation; binomial distribution; Poisson distribution; stochastic process; modelling
Online: 2 January 2020 (05:19:27 CET)
This study developed a Matlab application for simulating statistical models project (SMp) probabilistic distributions similar to the binomial and Poisson distributions, which were created by mathematical procedures. The simulated distributions are graphically compared with these classical distributions. The application can produce many probabilistic distributions and shows the trend (τ) for n trials with success probability p, i.e. the maximum probability at τ = np. While the Poisson distribution PD(x;µ) is a unique probabilistic distribution, with PD = 0 at x = +∞, the application simulates many SMp(x;µ,Xmax) distributions, where µ is the Poisson parameter and generally the value of x with the maximum probability, and Xmax is the upper limit of x with SMp(x;µ,Xmax) ≥ 0 and the limit of the stochastic region of the random discrete variable X. It is shown that, via simulation, one can obtain many probabilistic distributions, and better ones than by mathematical procedures alone.
ARTICLE | doi:10.20944/preprints201912.0197.v1
Subject: Engineering, Biomedical & Chemical Engineering Keywords: chaotic advection; microfluidics; micromixer; numerical simulation; process intensification
Online: 15 December 2019 (15:31:48 CET)
In the present paper, four passive micromixer designs (G1, G2, G3 and G4), inspired by distillation-column trays, are proposed. The devices' performance was assessed by numerical simulations. The mixing performance was investigated for Reynolds numbers from 0.01 to 100 and channel heights of 200 µm, 500 µm and 1000 µm for oil/ethanol flow. The G1 and G4 designs provided a high mixing index (> 0.975). The G1 device achieved superior mixing performance with a moderate pressure drop (about 0.5 MPa), owing to the induced flow-recirculation pattern, at a relatively high flow rate of 0.21 L/min, highlighting the potential of such a microdevice for the scale-up and numbering-up of microdevices in modular chemical plant processing.
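The abstract does not give the authors' exact mixing-index formula, but a definition commonly used in the micromixer literature, M = 1 − σ/σ_max, can be sketched as follows; the function name and sample values are illustrative assumptions:

```python
# Hedged sketch: a common mixing-index definition in micromixer studies,
#   M = 1 - sigma / sigma_max,
# where sigma is the standard deviation of mass-fraction samples over the
# outlet cross-section and sigma_max that of a fully segregated stream.
# M = 1 means perfect mixing, M = 0 means no mixing.
from statistics import pstdev

def mixing_index(mass_fractions, c_mean=0.5):
    """Return M = 1 - sigma/sigma_max for samples of a [0, 1] mass fraction."""
    sigma = pstdev(mass_fractions)
    sigma_max = (c_mean * (1.0 - c_mean)) ** 0.5   # fully segregated limit
    return 1.0 - sigma / sigma_max

well_mixed = [0.49, 0.51, 0.50, 0.48, 0.52]   # near-uniform outlet samples
segregated = [0.0, 1.0, 0.0, 1.0]             # unmixed parallel streams

print(mixing_index(well_mixed))   # close to 1, in the spirit of the reported > 0.975
print(mixing_index(segregated))   # 0: no mixing
```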
ARTICLE | doi:10.20944/preprints201901.0091.v1
Subject: Engineering, Civil Engineering Keywords: Acoustic emissions, fracture process, failure prediction, q-statistics
Online: 9 January 2019 (16:35:10 CET)
In this paper we present experimental results concerning Acoustic Emission (AE) recorded during cyclic compression tests on two different kinds of brittle building materials, namely concrete and basalt. The AE inter-event times were investigated through a non-extensive statistical mechanics analysis, which shows that their decumulative probability distributions follow q-exponential laws. The entropic index q and the relaxation parameter 1/Tq, obtained by fitting the experimental data, exhibit systematic changes during the various stages of the failure process; specifically, the (q, Tq) pairs align linearly. The Tq = 0 point corresponds to the macroscopic breakdown of the material. The slope of the linear alignment, including its sign, appears to depend on the chemical and mechanical properties of the sample. These results provide insight into the warning signs of the incipient failure of building materials and could therefore be used in monitoring the health of existing structures such as buildings and bridges.
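The q-exponential law mentioned above has a standard closed form in non-extensive statistical mechanics; a minimal sketch, with illustrative parameter values rather than the fitted ones from the paper, is:

```python
# Hedged sketch: the q-exponential survival (decumulative) distribution used
# in non-extensive statistical mechanics for inter-event times,
#   P(>t) = [1 + (q - 1) * t / Tq] ** (1 / (1 - q))   for q > 1,
# which recovers the ordinary exponential exp(-t / Tq) in the limit q -> 1.
import math

def q_exp_survival(t, q, Tq):
    """Decumulative q-exponential with entropic index q and relaxation time Tq."""
    if abs(q - 1.0) < 1e-12:                       # ordinary exponential limit
        return math.exp(-t / Tq)
    return (1.0 + (q - 1.0) * t / Tq) ** (1.0 / (1.0 - q))

# Sanity checks: decay starts at 1, and near q = 1 the curve matches exp(-t/Tq).
t, Tq = 2.0, 1.5
assert q_exp_survival(0.0, 1.3, Tq) == 1.0
assert abs(q_exp_survival(t, 1.0 + 1e-8, Tq) - math.exp(-t / Tq)) < 1e-6
```

Fitting this form to measured inter-event times yields the (q, Tq) pairs whose linear alignment the paper tracks toward failure.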
ARTICLE | doi:10.20944/preprints201810.0567.v1
Subject: Materials Science, Polymers & Plastics Keywords: solution process; thin films; composite material; dielectric constant
Online: 24 October 2018 (10:48:45 CEST)
This study reports the optical, structural and dielectric properties of poly(vinyl alcohol) thin-film membranes with embedded ZnO nanoparticles (PVA/ZnO), obtained by a solution-casting method at a low deposition temperature. Fourier-transform infrared spectra showed the characteristic peaks corresponding to the O-H and Zn-O bonds present in the hybrid material. The X-ray diffraction patterns indicated the presence of the ZnO structure in the films. The composite material showed low absorbance and a wide band-gap energy from 5.6 to 5.9 eV. The surface morphology of the PVA/ZnO thin films was studied by atomic force microscopy and scanning electron microscopy. The dielectric properties of the nanocomposites were measured from low to high frequencies; the results showed a high dielectric constant (ε) on the order of 10^4 at low frequency, and values from ε ≈ 2000 down to 100 over the range 1 kHz–1 MHz. Properties of PVA/ZnO such as its high permittivity and low processing temperature make it a suitable material for potential applications in the development of flexible electronic devices.
REVIEW | doi:10.20944/preprints201806.0251.v1
Subject: Earth Sciences, Other Keywords: Geological process, geological materials, trace elements, environmental health
Online: 15 June 2018 (11:47:52 CEST)
Inadequate data linking geology and health in developing countries contributes immensely to the challenge of identifying the sources and causes of many emerging diseases. Deficiencies and toxicities of trace elements generally impact human and animal health. A review of the geology of Ghana suggests the presence of oxide and sulphide minerals that are released into the natural environment during the geological process of weathering, which introduces both essential and potentially harmful elements. Of great concern is the fact that the majority of the Ghanaian population eats locally cultivated food and expects to be nourished by this diet. Furthermore, archived reports on rural drinking water in Ghana indicate that most of the aquifers are enriched in As and F while deficient in Mg. Medical geology, the emerging discipline that attempts to address the environmental health issues emanating from geological processes, is known in developed nations, but few of its activities are recognized in Ghana. This review has identified the concentrations of a number of elements in different geological settings and has linked these concentrations with health issues. There is therefore a need for medical geologists to work together with other disciplines to devise preventive as well as mitigative techniques to address the many geology-related health issues in Ghana.
ARTICLE | doi:10.20944/preprints201709.0004.v1
Online: 1 September 2017 (18:01:32 CEST)
Museums and science centres are informal education environments that aim to engage visitors with their exhibits. We present an efficient design process that enables an improved working relationship between museum practitioners, exhibition designers, and visitors. We present the principles and a graphical representation based on the Engagement Profile from previous work. Elements of the design process are evaluated using a learning game at the Engineerium science centre. The evaluation is based on a study with over five hundred visitors to the science centre.
ARTICLE | doi:10.20944/preprints201607.0050.v1
Subject: Engineering, Mechanical Engineering Keywords: Nanochannel; Molecular Dynamics Method; Nanoparticle; Argon; Boiling Process
Online: 18 July 2016 (10:37:40 CEST)
In this paper, the boiling flow inside a nanochannel containing 700,000 argon particles is simulated by molecular dynamics (MD). This approach is employed to analyze the superheated flow and its heat-transfer pattern. In all simulations, an external thrust force varying from 1 pN to 12 pN is exerted on the inlet nanoparticles along the channel to produce a forced annular boiling flow. The computations reveal that the saturation condition and the degree of superheat have significant impacts on the liquid-vapor interface. Furthermore, because of the major influence of surface tension throughout a nanochannel, the x-velocity of the liquid film and the vapor core shows no considerable fluctuations and remains smooth. All results show behavior closely matching the available outcomes in the literature.
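MD simulations of argon almost invariably model the pair interaction with a Lennard-Jones 12-6 potential; a minimal sketch with the standard literature parameters for argon follows (the abstract does not state the authors' force field, so treat the parameters as an assumption):

```python
# Hedged sketch: the Lennard-Jones 12-6 pair potential commonly used for
# argon in MD simulations. Parameters are the standard literature values
# for argon (epsilon/kB ~ 119.8 K, sigma ~ 3.405 angstrom); the abstract
# does not confirm that the authors used exactly these.
K_B = 1.380649e-23               # Boltzmann constant, J/K
EPSILON = 119.8 * K_B            # well depth, J
SIGMA = 3.405e-10                # zero-crossing distance, m

def lj_potential(r):
    """Lennard-Jones 12-6 potential energy (J) at pair separation r (m)."""
    sr6 = (SIGMA / r) ** 6
    return 4.0 * EPSILON * (sr6 * sr6 - sr6)

# Sanity checks: U(sigma) = 0, and the minimum at r = 2**(1/6)*sigma has depth -epsilon.
r_min = 2.0 ** (1.0 / 6.0) * SIGMA
assert abs(lj_potential(SIGMA)) < 1e-30
assert abs(lj_potential(r_min) + EPSILON) < 1e-26
```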
CONCEPT PAPER | doi:10.20944/preprints202301.0047.v1
Subject: Social Sciences, Education Studies Keywords: Zambia’s Higher Education Policy; Policies in Higher Education; Education and Policies in Zambia
Online: 4 January 2023 (02:46:40 CET)
Policies are adopted and implemented to achieve specific goals. In this regard, the Government of Zambia in 2019 adopted a higher education policy with many objectives, including the expansion of access to higher education; the reduction of gender and other forms of inequity in accessing higher education; and the improvement of the quality of higher education through increased funding to higher education institutions (HEIs), the construction and repair of infrastructure, and improved provision of learning materials in HEIs. This paper therefore sought to evaluate the higher education policy of 2019. Four criteria were used, namely effectiveness, equity, policy sustainability, and consistency. Other principles, such as the political and social acceptability of a policy, were not used because the policy has already been adopted and is being implemented. Further, the principle of efficiency was not used because it would have been problematic to gauge the expenditure against the outcomes. The findings reveal that the policy has to some extent been effective; some of the objectives, such as increased access to higher education and the reduction of inequality, have been partially achieved. It has been established that the number of students pursuing higher education increased from 91,969 in 2017 to 114,049 in 2020. Further, in 2021, 48.5% of scholarships in public universities were awarded to female students while 51.5% were awarded to male students. More students are being encouraged to pursue science, technology, engineering, and mathematics (STEM), and more women are pursuing studies in STEM-related fields. However, the policy has not improved the quality of higher education, as funding to public HEIs has not improved. Infrastructure remains very poor and inadequate, and access to up-to-date and relevant learning materials also remains a challenge in HEIs.
Moreover, the policy appears to be duplicated by the recently re-launched Technical Education, Vocational and Entrepreneurship Training (TEVET) national policy. Given the above findings, it is recommended that the government, among other things, improve funding and build infrastructure in public HEIs. There is also a need to harmonise the TEVET national policy with the higher education policy.
REVIEW | doi:10.20944/preprints202007.0528.v1
Subject: Life Sciences, Other Keywords: In vitro; in vivo; animal model; Malassezia; infection; host-pathogen interaction; Galleria mellonella
Online: 22 July 2020 (11:34:57 CEST)
Malassezia is a lipid-dependent genus of yeasts known as an important part of the skin mycobiota. These yeasts have been associated with the development of skin disorders and catalogued as causal agents of systemic infections under specific conditions, making them opportunistic pathogens. Little is known about the host-microbe interactions of Malassezia spp., and unraveling them requires the implementation of infection models. In this mini-review we present different models that have been implemented in the study of fungal infections, with particular attention to Malassezia spp. infections. These models range from in vitro (cell cultures and ex vivo tissue) to in vivo (murine models, rabbits, guinea pigs, insects, nematodes, and amoebas). We additionally highlight alternative models that reduce the use of mammals as model organisms, which have been gaining importance in the study of fungal host-microbe interactions because these systems have been shown to give reliable results that correlate with those obtained from mammalian models. Examples of alternative models are Caenorhabditis elegans, Drosophila melanogaster, Tenebrio molitor, and Galleria mellonella: invertebrates that have been used in the study of Malassezia spp. infections to identify differences in virulence between Malassezia species.
ARTICLE | doi:10.20944/preprints201908.0158.v1
Subject: Social Sciences, Business And Administrative Sciences Keywords: academic entrepreneurship; TTOs; full service KTOs; academic entrepreneurship in Bangladesh; patents in Bangladesh
Online: 14 August 2019 (03:13:18 CEST)
Academic entrepreneurship focuses on the commercialization of research. Even though it has been practiced worldwide for decades, Bangladesh is a newcomer in this area. In Bangladesh, only two universities have Technology Transfer Offices (TTOs), established with the sole focus of commercializing the research of the universities' students and faculty members. This article reviews worldwide practices in technology transfer and academic entrepreneurial activity, and explores the opportunities and challenges facing such entrepreneurs through a detailed investigation of the existing body of knowledge. Apart from exploring the problems and prospects of academic entrepreneurship in Bangladesh, this article also introduces the concept of a Full-Service Knowledge Transfer Office (KTO), which the existing literature does not offer. The authors believe that by establishing such KTOs as self-sustaining bodies, it is possible for academic entrepreneurs to stimulate, support and sustain their activities in Bangladesh. The concept of full-service KTOs can also become a model for other nations, especially developing ones, to establish and nurture a culture of academic entrepreneurship.
ARTICLE | doi:10.20944/preprints201711.0110.v1
Subject: Life Sciences, Biotechnology Keywords: Fluorescent reporter; E2-Crimson; mouse embryonic stem cells; knock-in; in vivo imaging
Online: 16 November 2017 (17:46:53 CET)
Far-red fluorescent reporter genes can be used for tracking cells non-invasively in vivo using fluorescence imaging. Here, we investigate the effectiveness of the far-red fluorescent protein E2-Crimson (E2C) for tracking mouse embryonic stem cells (mESCs) in vivo following subcutaneous administration into mice. Using a knock-in strategy, we introduced E2C into the Rosa26 locus of an E14-Bra-GFP mESC line, and after confirming that E2C had no obvious effect on the phenotype of the mESCs, we injected them into mice and imaged them over 9 days. The results showed that fluorescence intensity was weak, and cells could only be detected when injected at high densities. Furthermore, intensity peaked on day 4 and then started to decrease, despite the fact that tumour volume continued to increase beyond day 4. Histopathological analysis showed that although E2C fluorescence could barely be detected in vivo at day 9, analysis of frozen sections indicated that all mESCs within the tumours continued to express E2C. We hypothesise that the decrease in fluorescence intensity in vivo was probably because the mESC tumours became more vascular over time, leading to increased absorption of E2C fluorescence by haemoglobin. We conclude that the E2C reporter has limited use for tracking cells in vivo, at least when introduced as a single copy into the Rosa26 locus.
Subject: Keywords: Health information technology in Nepal
Online: 14 October 2019 (10:38:53 CEST)
The purpose of this article is to review a two-year implementation of an electronic health record (EHR) system at the outpatient department of a private hospital in Kathmandu, possibly the first implementation of its kind in Nepal. Procedures: The strategy designed for the EHR implementation was based on Professor John P. Kotter's work on successful change management. Main findings: We found that keeping a close watch on the social dynamics affecting adoption decisions among users of the system was crucial for implementing the EHR, partly because the project received limited support from an erratically changing hospital leadership and lacked an EHR system suited to the needs of the users. Conclusions: The implementation described in this article is a good lesson on moving from control to drift in information systems implementation, and we hope this work will be useful to health IT professionals working in the space of digital transformation in resource-constrained environments.
REVIEW | doi:10.20944/preprints202205.0229.v1
Subject: Materials Science, Nanotechnology Keywords: Nanomaterials; Nanotoxicology; Immunotoxicity; Genotoxicity; Epigenetics; Advanced in vitro models; In silico; Life Cycle Assessment
Online: 17 May 2022 (11:05:10 CEST)
The use of nanomaterials has been increasing in recent times, and they are widely used in industries such as cosmetics, pharmaceuticals, food, water treatment and agriculture. The rapid development of new nanomaterials demands a set of approaches to evaluate their potential toxicity and the risks related to them. In this regard, nanosafety research has been using and adapting existing methods (the toxicological approach), but the unique characteristics of nanomaterials demand new approaches (nanotoxicology) to fully understand their potential toxicity, immunotoxicity and (epi)genotoxicity. New technologies, such as organ-on-chip systems and sophisticated sensors, are also under development and/or adaptation. All the information generated is used to develop new in silico approaches that try to predict the potential effects of newly developed materials. The overall chain, from production to final disposition of nanomaterials, is evaluated under Life Cycle Assessment (LCA), which is becoming an important element of nanosafety with regard to sustainability and environmental impact. In this review we give an overview of all these elements of nanosafety.
ARTICLE | doi:10.20944/preprints202109.0527.v1
Subject: Chemistry, Analytical Chemistry Keywords: Gold Nanoparticles; Hyaluronate-Thiol; In vitro; In vivo; Peritumoral; Cancer; Cytotoxicity; ICP-OES; Biodistribution
Online: 1 October 2021 (11:12:31 CEST)
Biofouling is the unwanted adsorption of cells, proteins, or intracellular and extracellular biomolecules that can occur spontaneously on the surface of metal nanocomplexes. It represents a major issue in bioinorganic chemistry because it leads to the formation of a protein corona, which can destabilize a colloidal solution and result in undesired macrophage-driven clearance, consequently causing failed delivery of a targeted drug cargo. Hyaluronic acid (HA) is a bioactive, natural mucopolysaccharide with excellent antifouling properties, arising from its hydrophilic and polyanionic characteristics in physiological environments, which prevent opsonization. In this study, hyaluronate-thiol (HA-SH, MW 10 kDa) was used to surface-passivate gold nanoparticles (GNPs) synthesized using a citrate-reduction method. HA-functionalized GNP complexes (HA-GNPs) were characterized using absorption spectroscopy, scanning electron microscopy, zeta potential, and dynamic light scattering. GNP cellular uptake and potential dose-dependent cytotoxic effects of treatment were evaluated in vitro in HeLa cells using ICP-OES and Trypan blue and MTT assays. Further, we quantified the in vivo biodistribution of intratumorally injected HA-functionalized GNPs in Lewis lung carcinoma (LLC) solid tumors grown on the flank of C57BL/6 mice and compared their localization and retention with nascent particles. Our results reveal that HA-GNPs show greater overall peritumoral distribution (**p < 0.005, 3 days post-intratumoral injection) than citrate-GNPs, with reduced biodistribution in off-target organs. This property represents an advantageous step forward in the localized delivery of metal nanocomplexes to the infiltrative region of a tumor, which may improve the application of nanomedicine in the diagnosis and treatment of cancer.
ARTICLE | doi:10.20944/preprints202101.0406.v1
Subject: Medicine & Pharmacology, Allergology Keywords: Adapted COVID-Stress Scales; Stress in Academic Professionals; Resilience to COVID stress in Academia
Online: 20 January 2021 (16:37:25 CET)
To mitigate COVID-19 infection, many world governments endorsed the cessation of non-essential activities such as school attendance, thereby forcing the teaching model to evolve into the virtual classroom. In the present work we show the application of a modified version of the adapted COVID-19 stress scales (ACSS), which also included teaching anxiety and preparedness, and resilience, for academic professionals in Mexico during the unprecedented transformation of the education system undergone in the COVID-19 quarantine. Most of the studied variables (gender, age, academic degree, household occupants, having a disease, teaching level, teaching mode, work hours, resilience, teaching anxiety and preparedness, and fear of being an asymptomatic patient, FOBAP) showed statistically significant correlations with each other (p < 0.050) and with the six areas of the ACSS (danger, contamination, socio-economic, xenophobia, traumatic stress and compulsive checking). Our results further showed that the perceived stress and anxiety fell into the absent-to-mild category, with only the danger section of the ACSS falling into the moderate category. Finally, the resilience generated throughout the quarantine seems to be a predictor of the adaptation the academic professionals have undergone to cope with stress.