ARTICLE | doi:10.20944/preprints202004.0472.v1
Subject: Chemistry, Analytical Chemistry Keywords: 3-D printing; additive manufacturing; distributed manufacturing; laboratory equipment; open hardware; open source; open source hardware; scale; balance; mass
Online: 27 April 2020 (02:59:34 CEST)
This study provides designs for a low-cost, easily replicable open source lab-grade digital scale that can be used as a precision balance. The design can be manufactured for use in most labs throughout the world with open source RepRap-class material extrusion-based 3-D printers for the mechanical components and readily available open source electronics, including the Arduino Nano. Several versions of the design were fabricated and tested for precision and accuracy with a range of load cells. The open source scale was repeatable within 0.1 g across multiple load cells, with even better precision (0.01 g) depending on load cell range and style. The scale tracks linearly with proprietary lab-grade scales, meeting the performance specified in the load cell data sheets, indicating that it is accurate across the range of the installed load cell. The smallest load cell tested (100 g) offers precision on the order of a commercial digital mass balance. The scale can be produced at significant cost savings compared to scales of comparable range and precision when serial capability is present. The cost savings increase significantly as the range of the scale increases, making the design particularly well-suited for resource-constrained medical and scientific facilities.
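At the heart of such a scale is the conversion of raw load-cell amplifier counts into grams. A minimal sketch of the two-point (tare and reference-mass) calibration this implies; all readings below are hypothetical illustrations, not values from the study:

```python
def calibrate(raw_zero, raw_ref, ref_mass_g):
    """Return a function mapping raw ADC counts to grams via two-point calibration.

    raw_zero: reading with an empty pan (tare), raw_ref: reading with a known
    reference mass of ref_mass_g grams on the pan.
    """
    scale = ref_mass_g / (raw_ref - raw_zero)  # grams per count
    return lambda raw: (raw - raw_zero) * scale

# Hypothetical readings: empty pan and a 100 g reference mass.
to_grams = calibrate(raw_zero=8_400, raw_ref=214_000, ref_mass_g=100.0)
print(round(to_grams(111_200), 2))  # mass of an unknown sample, in grams
```

The same linear mapping is what lets the scale track proprietary balances across the full range of the load cell.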
REVIEW | doi:10.20944/preprints202004.0054.v1
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: pandemic; influenza pandemic; open source; open hardware; COVID-19; COVID-19 pandemic; medical hardware; open source medicine
Online: 6 April 2020 (12:38:59 CEST)
Distributed digital manufacturing offers a solution to medical supply and technology shortages during pandemics. To prepare for the next pandemic, this study reviews the state of the art in open hardware designs needed in a COVID-19-like pandemic. It evaluates the readiness of the top twenty technologies requested by the Government of India. The results show that the majority of the actual medical products have had some open source development; however, only 15% of the supporting technologies that make the open source devices possible are freely available. Considerable work is still needed to provide open source paths for the development of all the medical hardware needed during pandemics. Five core areas of future work are discussed: i) technical development of a wide range of open source solutions for all medical supplies and devices; ii) policies that protect the productivity of laboratories, makerspaces and fabrication facilities during a pandemic; iii) streamlining of the regulatory process; iv) Good-Samaritan laws to protect makers and designers of open medical hardware, as well as to compel those with life-saving knowledge to share it; and v) a requirement that all citizen-funded research be released with free and open source licenses.
ARTICLE | doi:10.20944/preprints201904.0207.v1
Subject: Engineering, Biomedical & Chemical Engineering Keywords: 3-D printing; additive manufacturing; biomedical equipment; biomedical engineering; centrifuge; design; distributed manufacturing; laboratory equipment; open hardware; open source; open source hardware; medical equipment; medical instrumentation; scientific instrumentation
Online: 18 April 2019 (08:03:58 CEST)
Centrifuges are commonly required devices in medical diagnostics facilities as well as scientific laboratories. Although there are commercial and open source centrifuges, the cost of the former and the electricity required to operate the latter limit accessibility in resource-constrained settings. There is a need for a low-cost, human-powered, verified and reliable lab-scale centrifuge. This study provides the designs for a low-cost, 100% 3-D printed centrifuge, which can be fabricated on any low-cost RepRap-class fused filament fabrication (FFF) or fused particle fabrication (FPF)-based 3-D printer. In addition, validation procedures are provided using a web camera and free and open source software. This paper provides the complete open source plans, including instructions for fabrication and operation, for a hand-powered centrifuge. This study successfully tested and validated the instrument, which can be operated anywhere in the world with no electricity inputs, obtaining a rotational speed of over 1,750 rpm and over 50 N of centrifugal force. Using commercial filament, the instrument costs about US$25, which is less than half the cost of all commercially available systems; however, the cost can be dropped further using recycled plastics on open source systems for over 99% savings. The results are discussed in the context of resource-constrained medical and scientific facilities.
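The centrifugal force generated follows directly from the rotor speed and radius. A short sketch of that standard conversion; the 60 mm rotor radius used here is an illustrative assumption, not a dimension taken from the design:

```python
import math

def rcf_in_g(rpm, radius_m):
    """Relative centrifugal force, in multiples of g, for a rotor spinning
    at rpm revolutions per minute with sample radius radius_m meters."""
    omega = 2 * math.pi * rpm / 60.0   # angular velocity, rad/s
    return radius_m * omega ** 2 / 9.81

# With an assumed 60 mm radius, 1750 rpm yields roughly 205 g of
# relative centrifugal force.
print(round(rcf_in_g(1750, 0.06), 1))
```

The webcam-based validation described in the abstract serves to confirm the rpm input to exactly this kind of calculation.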
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: 3-D printing; additive manufacturing; distributed manufacturing; distributed recycling; granulator; shredder; open hardware; fab lab; open-source; polymers; recycling; waste plastic; extruder; upcycle; circular economy
Online: 1 September 2019 (08:25:03 CEST)
In order to accelerate the deployment of distributed recycling by providing low-cost feedstocks of granulated post-consumer waste plastic, this study analyzes an open source waste plastic granulator system. It is designed, built and tested for its ability to convert post-consumer waste and 3-D printed products into polymer feedstock for recyclebots or fused particle/granule printers. The technical specifications of the device are quantified in terms of power consumption (380 W and 404 W for PET and PLA, respectively) and particle size distribution. The open source device can be fabricated for less than US$2,000 in materials. The experimentally measured power use is only a minor contribution to the overall embodied energy of distributed recycling of waste plastic. The resultant plastic particle size distributions were found to be appropriate for use in both recyclebots and direct material extrusion 3-D printers. Simple retrofits are shown to reduce sound levels during operation by 4-5 dB for the vacuum. These results indicate that the open source waste plastic granulator is an appropriate technology for community, library, makerspace, fab lab or small business-based distributed recycling.
Subject: Engineering, Other Keywords: drying; materials processing; vacuum oven; small-scale; lab equipment; air-powered; open hardware; open source; digital manufacturing; dehydration
Online: 22 April 2021 (09:16:02 CEST)
Vacuum drying can dehydrate materials further than dry heat methods while protecting sensitive materials from thermal degradation. Many industries have shifted to vacuum drying as a cost- or time-saving measure. Small-scale vacuum drying, however, has been limited by the high costs of specialty scientific tools. To make vacuum drying more accessible, this study provides design and performance information for a small-scale open source vacuum oven, which can be fabricated from off-the-shelf and 3-D printed components. The oven is tested for drying speed and effectiveness on both waste plastic polyethylene terephthalate (PET) and a consortium of bacteria developed for bioprocessing of terephthalate wastes, to assist in distributed recycling of PET for both additive manufacturing and potential food applications. Both materials can be damaged when exposed to high temperatures, making vacuum drying a desirable solution. The results showed the open source vacuum oven was effective at drying both plastic and biomaterials, drying at a higher rate than a hot-air dryer for small samples or low volumes of water. The system can be constructed for less than 20% of commercial vacuum dryer costs for several laboratory-scale applications, including dehydration of bio-organisms, drying plastic for distributed recycling and additive manufacturing, and chemical processing.
ARTICLE | doi:10.20944/preprints202006.0318.v1
Subject: Medicine & Pharmacology, General Medical Research Keywords: ventilator; pandemic; ventilation; influenza pandemic; coronavirus; coronavirus pandemic; pandemic ventilator; single-limb; open source; open hardware; COVID-19; medical hardware; RepRap; 3-D printing; open source medical hardware; embedded systems; real-time operating system
Online: 26 June 2020 (17:25:16 CEST)
This study describes the development of an automated bag valve mask (BVM) compression system, which, during acute shortages and supply chain disruptions, can serve as a temporary emergency ventilator. The resuscitation system is based on an Arduino controller with a real-time operating system, installed on a largely RepRap 3-D printable parametric component-based structure. The cost of the system is under $170, which makes it affordable for replication by makers around the world. The device provides a controlled breathing mode with tidal volumes from 100 to 800 milliliters, breathing rates from 5 to 40 breaths/minute, and inspiratory-to-expiratory ratios from 1:1 to 1:4. The system is designed for reliability and scalability of measurement circuits through the use of the serial peripheral interface, and it can accommodate additional hardware thanks to its object-oriented algorithmic approach. Experimental results demonstrate repeatability and accuracy exceeding human capabilities in BVM-based manual ventilation. Future work is necessary to further develop and test the system to make it acceptable for deployment outside of emergencies in clinical environments; however, the nature of the design is such that desired features are relatively easy to add and test using the protocols and parametric design files provided.
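The controlled breathing mode is governed by simple timing arithmetic: the breathing rate fixes the cycle period, and the inspiratory-to-expiratory (I:E) ratio splits it. A sketch of that split (the function name is ours, not from the device firmware):

```python
def breath_timing(rate_bpm, ie_ratio):
    """Split one breath cycle into inspiratory and expiratory durations.

    rate_bpm: breaths per minute; ie_ratio: the E part of a 1:E
    inspiratory-to-expiratory ratio (e.g. 2 means 1:2).
    """
    period = 60.0 / rate_bpm            # seconds per breath cycle
    t_insp = period / (1 + ie_ratio)    # inspiration time, s
    t_exp = period - t_insp             # expiration time, s
    return t_insp, t_exp

# 20 breaths/min at 1:2 gives a 3 s cycle: 1 s inspiration, 2 s expiration.
print(breath_timing(20, 2))
```

Sweeping `rate_bpm` over 5-40 and `ie_ratio` over 1-4 covers the operating envelope stated in the abstract.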
ARTICLE | doi:10.20944/preprints201905.0060.v1
Subject: Keywords: open source; 3D printing; Drosophila; laser cutter; lab equipment; open labware; fly-pushing; fly pad; fly plate; CO2 anesthesia
Online: 6 May 2019 (11:51:52 CEST)
One of the most important pieces of equipment used in labs culturing populations of fruit flies (Drosophila sp.) is the “CO2 gas plate”, which is used to anesthetize individuals during “fly-pushing”. This piece of equipment consists of a box with a porous top into which carbon dioxide is pumped. Flies placed on its surface are immobilized, permitting the sorting, categorizing and/or counting of flies during population culturing and experimental assays. Unfortunately, commercially available gas plates are typically expensive. Here, we describe a new design for a gas plate that can be easily produced using a 3D printer and a laser cutter, which we are making freely available to the fly community.
ARTICLE | doi:10.20944/preprints202005.0479.v1
Subject: Engineering, Mechanical Engineering Keywords: open source; open hardware; COVID-19; medical hardware; RepRap; 3-D printing; open source medical hardware; high temperature 3-D printing; additive manufacturing; ULTEM; polycarbonate
Online: 31 May 2020 (16:18:20 CEST)
Thermal sterilization is generally avoided for 3-D printed components because of the relatively low deformation temperatures of the common thermoplastics used for material extrusion-based additive manufacturing. 3-D printing high-temperature, heat-sterilizable components for COVID-19 and other applications demands 3-D printers with heated beds, hot ends that can reach higher temperatures than polytetrafluoroethylene (PTFE) hot ends, and heated chambers to avoid part warping and delamination. There are several high-temperature printers on the market, but their high costs make them inaccessible for the fully home-based distributed manufacturing required during pandemic lockdowns. To meet all of these requirements for under $1,000, the Cerberus, an open source three-headed self-replicating rapid prototyper (RepRap), was designed and tested with the following capabilities: i) a 200 °C-capable heated bed, ii) a 500 °C-capable hot end, iii) an isolated heated chamber with a 1 kW space heater core, and iv) mains voltage chamber and bed heating for rapid start. The Cerberus successfully prints polyetherketoneketone (PEKK) and polyetherimide (PEI, ULTEM) with tensile strengths of 77.5 and 80.5 MPa, respectively. As a case study, open source face masks were 3-D printed in PEKK and shown not to warp upon widely home-accessible oven-based sterilization.
Subject: Engineering, Energy & Fuel Technology Keywords: additive manufacturing; agriculture; agrivoltaic; distributed manufacturing; farming; gardening; open hardware; photovoltaic; recycling; solar energy
Online: 27 September 2021 (11:03:32 CEST)
There is an intense need to optimize agrivoltaic systems. This article describes the invention of a new testing system: the parametric open source cold-frame agrivoltaic system (POSCAS). POSCAS is an adapted gardening cold frame used in cold climates, where it acts as a small greenhouse for agricultural production. POSCAS is designed to test partially transparent solar photovoltaic (PV) modules targeting the agrivoltaic market. It can function as a traditional cold frame, but it can also be automated to function as a full-service greenhouse. The integrated PV module roof can be used to power the controls, or it can be attached to a microinverter to produce power. POSCAS can be placed in an experimental array for testing, agricultural and power production. It can be easily adapted for any type of partially transparent PV module. An array of POSCAS systems allows for testing agrivoltaic impacts from the percent transparency of the modules by varying the thickness of a thin film PV material or the density of silicon-based cells, as well as various forms of optical enhancement, anti-reflection coatings and solar light spectral-shifting materials in the back sheet. All agrivoltaic variables can be customized to identify ideal PV designs for a given agricultural crop.
ARTICLE | doi:10.20944/preprints202010.0107.v1
Subject: Life Sciences, Biochemistry Keywords: Bioprinting; microextrusion; tissue engineering; bioink; open-source; stem cells
Online: 6 October 2020 (08:24:54 CEST)
Three-dimensional (3D) bioprinting promises to be essential in tissue engineering for solving the rising demand for organs and tissues. Some bioprinters are commercially available, but their impact on the field is still limited by their cost or difficulty of tuning. Herein, we present a low-cost, easy-to-build printhead for microextrusion-based bioprinting (MEBB) that can be installed in many desktop 3D printers to transform them into 3D bioprinters. We can extrude bioinks with precise control of print temperature between 2 and 60 °C. We validated the versatility of the printhead by assembling it in three low-cost open-source desktop 3D printers. Multiple units of the printhead can also easily be combined in a single printer carriage to build a multi-material 3D bioprinter. Print resolution was evaluated by creating representative calibration models at different temperatures using natural hydrogels such as gelatin and alginate, and synthetic ones like poloxamer. Using one of the three modified low-cost 3D printers, we successfully printed cell-laden lattice constructs with cell viabilities higher than 90% at 24 h post-printing. Controlling temperature and pressure according to the rheological properties of the bioinks was essential in achieving optimal printability and high cell viability. The cost per unit of our device, which can be used with syringes of different volumes, is lower than that of any other commercially available product. These data demonstrate an affordable open-source printhead with the potential to become a reliable alternative to commercial bioprinters for any laboratory.
ARTICLE | doi:10.20944/preprints202209.0174.v1
Subject: Engineering, Civil Engineering Keywords: open-source; photovoltaic; mechanical design; electric vehicle; solar energy; solar carport; electric vehicle charging station
Online: 13 September 2022 (10:41:43 CEST)
Solar powering the increasing fleet of electric vehicles (EVs) demands more surface area than may be available on photovoltaic (PV)-powered buildings. Parking lot solar canopies can provide the needed area to charge EVs, but are substantially costlier than roof- or ground-mounted PV systems. To provide a lower-cost PV parking lot canopy that supplies EV charging beneath it, this study provides a full mechanical and economic analysis of three novel PV canopy systems: (1) an exclusively wood, single parking spot spanning system, (2) a wood and aluminum double parking spot spanning system, and (3) a wood and aluminum cantilevered system for curbside parking. All systems can be scaled to any number of EV parking spots. The complete designs and bill of materials (BOM) of the canopies are provided along with basic instructions and are released under an open source license that enables anyone to fabricate them. The results found that single-span systems have cost savings of 82%-85%, double-span systems save 43%-50%, and cantilevered systems save 31%-40%. In the first year of operation, the PV canopies can provide 157% of the energy needed to charge the least efficient EV currently on the market if it is driven the average driving distance in London, ON, Canada.
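The 157% figure is an energy-balance ratio: annual PV output divided by annual EV charging demand. A sketch of that arithmetic with purely illustrative numbers chosen for the example, not the study's actual inputs:

```python
def coverage_pct(pv_kwh_per_year, km_per_year, ev_kwh_per_km):
    """Share (in %) of an EV's annual charging demand met by a canopy's PV output."""
    demand_kwh = km_per_year * ev_kwh_per_km   # annual charging energy needed
    return 100.0 * pv_kwh_per_year / demand_kwh

# Hypothetical inputs: a canopy yielding 7,850 kWh/yr, an EV driven
# 15,200 km/yr that consumes 0.329 kWh/km.
print(round(coverage_pct(7850, 15200, 0.329)))
```

A value above 100% means the canopy generates a surplus beyond the parked EV's needs in its first year.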
ARTICLE | doi:10.20944/preprints202302.0056.v1
Subject: Life Sciences, Other Keywords: metabolomics; untargeted; mass-spectrometry; open-source; bioinformatics
Online: 3 February 2023 (04:16:00 CET)
Untargeted metabolomics is a powerful tool for measuring and understanding complex biological chemistries. However, the employment, bioinformatics and downstream analysis of mass spectrometry (MS) data can be daunting for inexperienced users. Numerous open-source and free-to-use data processing and analysis tools exist for various untargeted MS approaches, including liquid chromatography (LC), but choosing the ‘correct’ pipeline is not straightforward. This tutorial, in conjunction with a user-friendly online guide, presents a workflow for connecting these tools to process, analyse and annotate various untargeted MS datasets. The workflow is intended to guide exploratory analysis in order to inform decision-making regarding costly and time-consuming downstream targeted MS approaches. We provide practical advice concerning experimental design, organisation of data and downstream analysis, and offer details on sharing and storing valuable MS data for posterity. The workflow is editable and modular, allowing flexibility for updated or changing methodologies and increased clarity and detail as user participation becomes more common. Hence, the authors welcome contributions and improvements to the workflow via the online repository. We believe that this workflow will streamline and condense complex mass spectrometry approaches into easier, more manageable analyses, thereby generating opportunities for researchers previously discouraged by inaccessible and overly complicated software.
REVIEW | doi:10.20944/preprints202007.0153.v1
Online: 8 July 2020 (11:53:33 CEST)
Large datasets that enable researchers to perform investigations with unprecedented rigor are growing increasingly common in neuroimaging. Due to the simultaneous increasing popularity of open science, these state-of-the-art datasets are more accessible than ever to researchers around the world. While analysis of these samples has pushed the field forward, they pose a new set of challenges that might cause difficulties for novice users. Here, we offer practical tips for working with large datasets from the end-user’s perspective. We cover all aspects of the data life cycle: from what to consider when downloading and storing the data, to tips on how to become acquainted with a dataset one did not collect, to what to share when communicating results. This manuscript serves as a practical guide one can use when working with large neuroimaging datasets, thus dissolving barriers to scientific discovery.
ARTICLE | doi:10.20944/preprints202211.0093.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: automatic typesetting; media-neutral publishing; open access; open source; scholarly publishing; XML/HTML conversion
Online: 4 November 2022 (13:17:34 CET)
Due to resource constraints, most Diamond Open Access journals publish fewer than 25 articles per year, and 75% of journals are not able to provide their content in XML and HTML, primarily providing only PDFs (Bosman et al., 2021, pp. 7-8). In order to keep up with larger commercial publishers, a high degree of automation and streamlining of processes is necessary. The Open Source Academic Publishing Suite (OS-APS) project, funded by the German Federal Ministry of Education and Research, aims to achieve this. OS-APS automatically extracts the underlying XML from Word manuscripts and offers optimization and export options in various formats (PDF, HTML, EPUB). The professional corporate design, e.g. of the PDFs, is managed automatically by using templates or creating one's own with a Template Development Kit. OS-APS will also connect to scholarly-led and community-driven publishing platforms such as Open Journal Systems (OJS), Open Monograph Press (OMP), and DSpace: the software will be able to be integrated into a wide range of publication processes, whether at small, low-resource commercial Open Access publishers or at institutional and Diamond Open Access publishers. References: Bosman, J., Frantsvåg, J. E., Kramer, B., Langlais, P.-C., & Proudman, V. (2021). OA Diamond Journals Study. Part 1: Findings. https://doi.org/10.5281/zenodo.4558703
ARTICLE | doi:10.20944/preprints201912.0063.v1
Subject: Engineering, Other Keywords: Software Quality Metrics; closed source software; open source software; Kahane’s Approach; UCP (Use Case Points) model and William’s Models
Online: 5 December 2019 (08:37:56 CET)
The complexity of software is increasing day by day due to the growing size of the projects being developed. For better planning and management of large software projects, estimation of software quality is important. During development, complexity metrics are used as indicators of the attributes and characteristics of quality software. There are many studies on the effect of software complexity on cost and quality. In this study, we discuss the effects of software complexity on the quality attributes of open source and closed source software, even though the quality metrics for open and closed source software are not distinct from each other. We comparatively analyze the impact of complexity metrics on open source and proprietary software. We also present various models for the management of project complexity, such as William's Model, Stacey's Agreement and Certainty Matrix, Kahane's Approach and the UCP Model. Quality metrics here refer to standards for the measurement of software quality, comprising attributes or characteristics of the software that relate to its quality. The quality attributes addressed in this study include usability, reliability, security, portability, maintainability, efficiency, cost, standards and availability. Both open source and closed source software are evaluated on the basis of these quality attributes. This study also recommends future approaches to managing the quality of open source and closed source software projects and specifies which of the two is mostly used in industry.
ARTICLE | doi:10.20944/preprints201906.0251.v1
Subject: Physical Sciences, Applied Physics Keywords: video microscopy, imaging, automated data acquisition, nanoparticle tracking, measurement embedded applications, open-source software
Online: 25 June 2019 (12:53:50 CEST)
We introduce PyNTA, a modular instrumentation software package for live particle tracking. By using the multiprocessing library of Python and the distributed messaging library pyZMQ, PyNTA allows users to acquire images from a camera at close to the maximum readout bandwidth while simultaneously performing computations on each image on a separate processing unit. This publisher/subscriber pattern generates a small overhead and leverages the multi-core capabilities of modern computers. We demonstrate the capabilities of the PyNTA package on the featured application of nanoparticle tracking analysis. Real-time particle tracking on megapixel images at a rate of 50 Hz is presented. Reliable live tracking reduces the required storage capacity for particle tracking measurements by a factor of approximately 10³ compared with raw data storage, allowing for a virtually unlimited duration of measurements.
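The acquire-while-processing pattern described above can be illustrated with a stdlib-only sketch in which threads and a queue stand in for PyNTA's separate processes and pyZMQ messaging; the toy "tracker" simply locates the brightest pixel in each frame:

```python
import queue
import threading

def acquire(frames, out_q):
    """Producer: push frames (here, plain nested lists of pixel values) to the queue."""
    for frame in frames:
        out_q.put(frame)
    out_q.put(None)  # sentinel: acquisition finished

def track(out_q, results):
    """Consumer: record the (row, col) of the brightest pixel in each frame."""
    while True:
        frame = out_q.get()
        if frame is None:
            break
        best = max(
            ((r, c) for r, row in enumerate(frame) for c, _ in enumerate(row)),
            key=lambda rc: frame[rc[0]][rc[1]],
        )
        results.append(best)

frames = [[[0, 1], [9, 2]], [[5, 0], [0, 3]]]
q, hits = queue.Queue(), []
t1 = threading.Thread(target=acquire, args=(frames, q))
t2 = threading.Thread(target=track, args=(q, hits))
t1.start(); t2.start(); t1.join(); t2.join()
print(hits)  # → [(1, 0), (0, 0)]
```

In PyNTA itself the two roles run in separate processes connected by ZeroMQ sockets, which is what keeps acquisition at full camera bandwidth while analysis proceeds on another core.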
ARTICLE | doi:10.20944/preprints201902.0087.v1
Subject: Social Sciences, Geography Keywords: noise mapping; END directive; GIS; open source; standards; road traffic; population exposure
Online: 11 February 2019 (09:48:53 CET)
The urbanisation phenomenon, and the related expansion of cities and transport networks, entails preventing an increase in the population exposed to environmental pollution. Regarding noise exposure, the Environmental Noise Directive requires major metropolises to produce noise maps. While based on standard methods, these maps are usually generated by proprietary software and require numerous input data concerning, for example, buildings, land use, the transportation network and traffic. The present work describes an open source implementation of a noise mapping tool fully implemented in a Geographic Information System compliant with the Open Geospatial Consortium standards. This integration simplifies the formatting and harvesting of noise model input data, the cartographic rendering, and the linkage of output data with population data. An application is given for a French city, which consists of estimating the impact of road traffic-related scenarios in terms of population exposure to noise levels, both in relation to a threshold value and by level classes.
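Once per-building noise levels have been computed, linking them to population reduces to the kind of aggregation sketched below. The threshold and class edges here are illustrative choices, not necessarily the Directive's reporting bands:

```python
from bisect import bisect_right

def exposure_summary(levels_db, populations, threshold_db, class_edges):
    """Population above a noise threshold, plus a histogram over level classes.

    levels_db[i] is the noise level at building i, populations[i] its residents;
    class_edges are the (sorted) boundaries between level classes in dB.
    """
    above = sum(p for l, p in zip(levels_db, populations) if l >= threshold_db)
    classes = [0] * (len(class_edges) + 1)
    for l, p in zip(levels_db, populations):
        classes[bisect_right(class_edges, l)] += p
    return above, classes

# Four buildings with assumed levels and populations, a 68 dB threshold,
# and 5 dB classes with edges at 55/60/65/70 dB.
print(exposure_summary([52, 58, 63, 71], [120, 300, 220, 80], 68, [55, 60, 65, 70]))
```

This is the step where coupling the noise model to GIS population layers pays off: the same per-receiver levels feed both the threshold count and the class histogram.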
ARTICLE | doi:10.20944/preprints202001.0080.v1
Subject: Engineering, Energy & Fuel Technology Keywords: shale gas; MRST; embedded discrete fracture model; open-source implementation
Online: 9 January 2020 (09:59:37 CET)
We present a generic and open-source framework for the numerical modeling of the expected transport and storage mechanisms in unconventional gas reservoirs. These unconventional reservoirs typically contain natural fractures at multiple scales. Considering the importance of these fractures in shale gas production, we perform a rigorous study on the accuracy of different fracture models. The framework is validated against an industrial simulator and is used to perform a history-matching study on the Barnett shale. This work presents an open-source code that leverages cutting-edge numerical modeling capabilities such as automatic differentiation, stochastic fracture modeling, multi-continuum modeling and other explicit and discrete fracture models. We modified the conventional mass balance equation to account for the physical mechanisms that are unique to organic-rich source rocks. Some of these include the use of an adsorption isotherm, a dynamic permeability-correction function, and an embedded discrete fracture model (EDFM) with fracture-well connectivity. We explore the accuracy of the EDFM for modeling hydraulically-fractured shale-gas wells, which could be connected to natural fractures of finite or infinite conductivity, and which could deform during production. Simulation results indicate that although the EDFM provides a computationally efficient model for describing flow in natural and hydraulic fractures, it could be inaccurate under three conditions: (1) when the fracture conductivity is very low, (2) when the fractures are not orthogonal to the underlying Cartesian grid blocks, and (3) when sharp pressure drops occur in large grid blocks with insufficient mesh refinement. Each of these results is very significant considering that most of the fluids in these ultra-low matrix permeability reservoirs are produced through the interconnected natural fractures, which are expected to have very low fracture conductivities.
We also expect sharp pressure drops near the fractures in these shale gas reservoirs, and it is unrealistic to expect the hydraulic fractures or complex fracture networks to be orthogonal to any structured grid. In conclusion, this paper presents an open-source numerical framework to facilitate the modeling of the expected physical mechanisms in shale-gas reservoirs. The code was validated against published results and a commercial simulator. We also performed a history-matching study on a naturally-fractured Barnett shale-gas well, considering adsorption, gas slippage and diffusion, and fracture closure as well as proppant embedment, using the framework presented. This work provides the first open-source code that can be used to facilitate the modeling and optimization of fractured shale-gas reservoirs. To provide the numerical flexibility to accurately model stochastic natural fractures that are connected to hydraulically-fractured wells, it is built atop other related open-source codes. We also present the first rigorous study on the accuracy of using the EDFM to model both hydraulic fractures and natural fractures that may or may not be interconnected.
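The adsorption term added to the mass balance for organic-rich rocks is typically a Langmuir isotherm. A minimal sketch of that relation; the Barnett-like constants in the example are illustrative assumptions, not values from the study:

```python
def langmuir_adsorbed_gas(p_psi, v_l, p_l_psi):
    """Langmuir isotherm: adsorbed gas content at reservoir pressure p_psi.

    v_l: Langmuir volume (the adsorption plateau, e.g. in scf/ton);
    p_l_psi: Langmuir pressure, at which exactly half of v_l is adsorbed.
    """
    return v_l * p_psi / (p_l_psi + p_psi)

# With an assumed Langmuir volume of 96 scf/ton and Langmuir pressure of
# 650 psi, the rock holds exactly half its plateau volume at p = 650 psi:
print(langmuir_adsorbed_gas(650, 96, 650))  # → 48.0
```

As pressure declines during production, the negative slope of this curve releases adsorbed gas back into the matrix, which is why the term belongs inside the mass balance rather than as a boundary condition.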
ARTICLE | doi:10.20944/preprints201808.0233.v1
Subject: Engineering, Electrical & Electronic Engineering Keywords: 3-D printing; circuit milling; circuit design; distributed manufacturing; electronics; electronics prototyping; free and open-source hardware; P2P; P2P manufacturing
Online: 13 August 2018 (16:42:54 CEST)
Barriers to inventing electronic devices include the challenge of iterating electronic designs, due to long lead times for professional circuit board milling or the high costs of commercial milling machines. To overcome these barriers, this study provides open source (OS) designs for a low-cost circuit milling machine. First, design modifications for the mechanical and electrical sub-systems of the OS D3D Robotics prototyping system are provided. Next, Copper Carve, an OS custom graphical user interface, is developed to enable circuit board milling by implementing backlash and substrate distortion compensation. The performance of the OS D3D circuit mill is then quantified and validated for positional accuracy, cut quality, feature accuracy and distortion compensation. Finally, the return on investment is calculated for inventors using it. The results show that, by properly compensating for motion inaccuracies with Copper Carve, the machine achieves a motion resolution of 10 microns, which is more than adequate for most circuit designs. The mill is at least five times less expensive than all commercial alternatives, and the material costs of the D3D mill are repaid by fabricating 20-43 boards. The results show that the OS circuit mill is of high enough quality to enable rapid invention and distributed manufacturing of complex products containing custom electronics.
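The return-on-investment figure reduces to dividing the machine cost by the per-board saving relative to outsourced milling. A sketch with hypothetical figures chosen for the example (the study's actual cost inputs will differ):

```python
import math

def boards_to_payback(machine_cost, commercial_cost_per_board, own_cost_per_board):
    """Number of boards after which the mill's cost is recovered, given the
    per-board saving versus a commercial milling service."""
    saving = commercial_cost_per_board - own_cost_per_board
    return math.ceil(machine_cost / saving)

# Hypothetical: a $500 mill, $30 per commercially milled board, $5 in
# materials per self-milled board.
print(boards_to_payback(500, 30, 5))  # → 20
```

Varying the outsourced board price across a plausible range is what produces a payback interval like the 20-43 boards reported.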
ARTICLE | doi:10.20944/preprints202010.0132.v1
Subject: Engineering, Energy & Fuel Technology Keywords: sector coupling; gas grid; district heating grid; grid simulation; network analysis; grid operation; open source; multi-energy grids; energy supply; infrastructure design
Online: 6 October 2020 (14:48:08 CEST)
The increasing complexity of the design and operational evaluation of multi-energy grids (MEGs) requires tools for the coupled simulation of power, gas and district heating grids. Most tools analyzed in this paper either do not allow coupling of infrastructures, simplify the grid model, or are not publicly available. We introduce the open source piping grid simulation tool pandapipes, which, in interaction with pandapower, fulfills three crucial criteria: a clear data structure, adaptable MEG model setup, and performance. In an introduction to pandapipes, we illustrate how it fulfills these criteria through its internal structure and demonstrate how it performs in comparison to STANET®. We then present two case studies that have already been performed with pandapipes. The first demonstrates a peak-shaving strategy through the interaction of a local electricity grid and a district heating grid in a small settlement. The second analyzes the potential of a power-to-gas device to serve as flexibility in a power grid under consideration of gas grid constraints. Both show the importance of a clear data structure, a simple simulation setup and good performance for setting up large and complex studies on grid infrastructure design and operation.
REVIEW | doi:10.20944/preprints201905.0302.v1
Subject: Social Sciences, Economics Keywords: open science; open access; open data; economic impacts
Online: 27 May 2019 (11:19:59 CEST)
A common motivation for increasing open access to research findings and data is the potential to create economic benefits – but evidence is patchy and diverse. This study systematically reviewed the evidence on what kinds of economic impacts (positive and negative) open science can have, how these come about, and how benefits could be maximized. Use of open science outputs often leaves no obvious trace, so most evidence of impacts is based on interviews, surveys, inference based on existing costs, and modelling approaches. There is indicative evidence that open access to findings/data can lead to savings in access costs, labour costs and transaction costs. There are examples of open science enabling new products, services, companies, research and collaborations. Modelling studies suggest higher returns to R&D if open access permits greater accessibility and efficiency of use of findings. Barriers include lack of skills capacity in search, interpretation and text mining, and lack of clarity around where benefits accrue. There are also contextual considerations around who benefits most from open science (e.g. sectors, small vs larger companies, types of dataset). Recommendations captured in the review include more research, monitoring and evaluation (including developing metrics), promoting benefits, capacity building and making outputs more audience-friendly.
ARTICLE | doi:10.20944/preprints202005.0310.v1
Subject: Medicine & Pharmacology, Pathology & Pathobiology Keywords: open hardware; COVID-19; medical hardware; nasopharyngeal swab; nasal swab; UV curing; SLA; RepRap; 3-D printing; additive manufacturing
Online: 19 May 2020 (04:21:41 CEST)
Access to nasopharyngeal swabs for sampling remains a bottleneck for COVID-19 testing in some regions. This study develops a distributed manufacturing solution using an open source manufacturing tool chain consisting of two types of open source 3-D printing and batch UV curing, and provides a fully free parametric design of a nasopharyngeal swab. The swab was designed in parametric OpenSCAD in two components (a head with an engineered break point, and various handles), which has several advantages: i) minimizing print time on relatively slow SLA printers, ii) enabling the use of smaller-print-volume open source SLA printers, iii) reducing the amount of relatively expensive UV resin, and iv) enabling production of handles on more accessible material extrusion 3-D printers. A modular open source UV LED box was designed, fabricated for $45, and tested for batch curing. Swabs can be fabricated for $0.06-$0.12/swab. The mechanical validation tests showed that the swabs could withstand greater forces than would be expected in normal clinical use. The swabs also absorbed significant amounts of synthetic mucus materials and passed abrasion and handling tests. The results show the open source swabs are promising candidates for clinical trials.
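The per-swab cost range can be read as resin cost plus amortized equipment; a minimal sketch, where the $45 box cost comes from the abstract but the batch size and per-swab resin cost are assumptions.

```python
def cost_per_swab(resin_cost_usd, box_cost_usd, swabs_produced):
    """Per-swab cost with the UV curing box amortized over a production run."""
    return resin_cost_usd + box_cost_usd / swabs_produced

# Assumed run: $0.06 of resin per swab, the $45 UV box spread over 1000 swabs.
print(round(cost_per_swab(0.06, 45, 1000), 3))  # -> 0.105
```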
ARTICLE | doi:10.20944/preprints201805.0284.v1
Subject: Engineering, Electrical & Electronic Engineering Keywords: dual two-level voltage source inverter; common-mode voltage; discontinuous space vector modulation schemes; centralizing pulse width modulation; open-end load
Online: 22 May 2018 (05:11:06 CEST)
Popular motor drive systems based on a single two-level voltage source inverter (VSI) have one main problem: the occurrence of common-mode voltage (CMV), which gives rise to electromagnetic interference, shaft voltage, bearing currents and leakage current. These cause high stress, increased temperature and early mechanical failure in the machine. To overcome this problem, a dual two-level VSI feeding an open-end three-phase ac load can eliminate the CMV at the ac/induction motor load, using the 120-degree modulation technique to control each inverter. In this paper, discontinuous space vector modulation (DSVM) schemes are proposed and applied to the dual two-level VSI fed open-end load. The approach is based on the 120-degree modulation technique, using only 12 active voltage vectors and 10 zero voltage vectors out of the total 64 voltage vectors, together with different five-segment switching sequence designs and a centralizing pulse width modulation technique, in order not only to cancel the CMV in the ac load but also to reduce the switching count and switching loss of the conversion system. The performances of the various DSVM schemes are compared in terms of the number of switchings, the step and peak value of the CMV in each inverter, and the quality of the output waveform. The verification and comparison are carried out by simulation in Matlab/Simulink.
Subject: Social Sciences, Library & Information Science Keywords: bioeconomy; open science; open access
Online: 30 October 2020 (14:45:27 CET)
The purpose of this paper is to assess the degree of openness of scientific articles on bioeconomy. Based on a WoS corpus of 2,489 articles published between 2015 and 2019, we calculated bibliometric indicators, explored the openness of each paper and assessed the share of journals, countries and research areas of these articles. The results show a sharp increase and diversification of articles in the field of bioeconomy, with an emerging long-tail distribution. 45.6% of the articles are freely available, and the share of OA papers is steadily increasing, from 31% in 2015 to 52% in 2019. Gold is the most important variant of OA. Open access is low in the applied research areas of chemical, agricultural and environmental engineering but higher in the domains of energy and fuels, forestry, and green and sustainable science and technology. The UK and the Netherlands have the highest rates of OA papers, followed by Spain and Germany. The funding rate of OA papers is higher than that of non-OA papers. This is the first bibliometric study on open access to articles on bioeconomy. The results can be useful for the further development of OA editorial and funding criteria in the field of bioeconomy.
ARTICLE | doi:10.20944/preprints202006.0207.v1
Subject: Keywords: open hardware; COVID-19; medical hardware; Powered Air-Purifying Respirator; PAPR; RepRap; 3-D printing; additive manufacturing; personal protective equipment; safety equipment
Online: 17 June 2020 (03:50:13 CEST)
To help firefighters and other first responders use their existing equipment for respiration during the COVID-19 pandemic without consuming single-use, low-supply masks, this study outlines an open source kit to convert a 3M-manufactured Scott Safety self-contained breathing apparatus (SCBA) into a powered air-purifying particulate respirator (PAPR). The open source PAPR can be fabricated with a low-cost 3-D printer and widely available components for less than $150, replacing commercial conversion kits (saving 85%) or full-fledged proprietary PAPRs (saving over 90%). The parametric designs allow adaptation to other core components and can be custom fit to firefighter equipment, including suspenders. The open source PAPR has controllable air flow, and its design enables breathing even if the fan is disconnected or the battery dies. The open source PAPR was tested for air flow as a function of battery life and was found to meet NIOSH air flow requirements for 4 hours, 300% beyond expected regular use.
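The quoted savings imply approximate commercial prices; a small sketch of the arithmetic (the $150 kit cost is from the abstract, and the implied prices simply follow from the stated percentages).

```python
def implied_commercial_price(kit_cost, savings_fraction):
    """If a kit saves savings_fraction versus a commercial product,
    that product costs kit_cost / (1 - savings_fraction)."""
    return kit_cost / (1 - savings_fraction)

print(round(implied_commercial_price(150, 0.85)))  # -> 1000 (conversion kit)
print(round(implied_commercial_price(150, 0.90)))  # -> 1500 (proprietary PAPR)
```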
CASE REPORT | doi:10.20944/preprints201905.0166.v1
Subject: Social Sciences, Library & Information Science Keywords: Open Annotation; Monographs; Open Access; Higher Education; Open Peer Review
Online: 14 May 2019 (10:03:41 CEST)
The digital format opens up new possibilities for interaction with monographic publications. In particular, annotation tools make it possible to broaden the discussion of a book's content, to suggest new ideas, to report errors or inaccuracies, and to conduct open peer reviews. However, this requires the support of users who may not yet be familiar with the annotation of digital documents. This paper gives concrete examples and recommendations for exploiting the potential of annotation in academic research and teaching. After presenting the Hypothesis annotation tool, the article focuses on its use in the context of HIRMEOS (High Integration of Research Monographs in the European Open Science Infrastructure), a project aimed at improving Open Access digital monographs. The approach and aims of a post-peer-review experiment with the annotation tool, as well as its use in teaching activities concerning monographic publications, are presented and proposed as potential best practices for similar annotation activities.
ARTICLE | doi:10.20944/preprints202011.0282.v1
Subject: Social Sciences, Accounting Keywords: Open Research Data; Open Peer Review; medicine; health sciences; Open Science; Open Access; health scientists; FAIR
Online: 9 November 2020 (16:02:24 CET)
In recent years, significant initiatives have been launched to disseminate Open Access as part of the Open Science movement. Nevertheless, the other major pillars of Open Science, such as Open Research Data (ORD) and Open Peer Review (OPR), are still at an early stage of development among communities of researchers and stakeholders. The present study sought to unveil the perceptions of a medical and health sciences community about these issues. Investigating researchers' attitudes yields valuable conclusions, especially in medicine and the health sciences, where scientific publishing is growing explosively. A quantitative survey was conducted based on a structured questionnaire, with a 51.8% response rate (215 responses out of 415 electronic invitations). The participants agreed with the ORD principles; however, they were unfamiliar with basic terms such as FAIR (Findable, Accessible, Interoperable and Reusable) and appeared reluctant to permit the exploitation of their data. Regarding OPR, participants expressed their agreement, implying interest in a trustworthy evaluation system. In conclusion, researchers need proper training in both ORD principles and OPR processes, which, combined with a reformed evaluation system, will enable them to take full advantage of the opportunities arising from the new scholarly publishing and communication landscape.
EDITORIAL | doi:10.20944/preprints201605.0001.v1
Subject: Keywords: Preprints, Open Science
Online: 3 May 2016 (14:43:02 CEST)
Preprints is a multidisciplinary preprint platform that makes scientific manuscripts from all fields of research immediately available at www.preprints.org. Preprints is a free (not-for-profit) open access service supported by MDPI in Basel, Switzerland.
ARTICLE | doi:10.20944/preprints201909.0122.v1
Subject: Medicine & Pharmacology, Other Keywords: open health; simple rules; ethics; reproducibility; research significance; open science
Online: 11 September 2019 (13:27:26 CEST)
We are witnessing a dramatic transformation in the way we do science. In recent years, significant flaws with existing scientific methods have come to light, including lack of transparency, insufficient involvement of stakeholders, disconnection from the public, and limited reproducibility of research findings. These concerns have sparked a global movement to revolutionize scientific practice and the emergence of Open Science. This new approach to science extends principles of openness to the entire research cycle, from hypothesis generation to data collection, analysis, replication, and translation from research to practice. Open Science seeks to remove all barriers to conducting high quality, rigorous, and impactful scientific research by ensuring that the data, methods, and opportunities for collaboration are open to all. Emerging digital technologies and "big data" (see "Ten simple rules for responsible big data research") have further accelerated the Open Science movement by affording new approaches to data sharing, connecting researcher networks, and facilitating the dissemination of research findings. Open scientific practices are also having a profound impact on the health sciences and medical research, and specifically how we conduct clinical research with human participants. Human health research necessitates careful considerations for practicing science in an ethical manner. There is also a particular urgency to human health research since the goal is to help people, so doing good science takes on a different meaning than simply doing science well. It also implores the scientist to reassess the conventional view of human health research as a pursuit conducted by scientists on human subjects, and lays a greater emphasis on inclusive and ethical practices to ensure that the research takes into account the interests of those who would be most impacted by the research. 
Openness in the context of human health research also raises greater concerns about privacy and security and presents more opportunities for people, including participants of research studies, to contribute in every capacity. At the core of open health research, scientific discoveries are not only the product of collaboration across disciplines, but must also be owned by the community that is inclusive of researchers, health workers, and patients and their families. To guide successful open health research practices, it is essential to carefully consider and delineate its guiding principles. This editorial is aimed at individuals participating in health science in any capacity, including but not limited to people living with medical conditions, health professionals, study participants, and researchers spanning all types of disciplines. We present ten simple rules that, while not comprehensive, offer guidance for conducting health research with human participants in an open, ethical, and rigorous manner. These rules can be difficult, resource-intensive, and can conflict with one another. They are aspirational and are intended to accelerate and improve the quality of human health research. Work that fails to follow these rules is not necessarily an indication of poor quality research, especially if the reasons for breaking the rules are considered and articulated (see rule 6: document everything). While most of the responsibility of following these rules falls on researchers, anyone involved in human health research in any capacity can apply them.
ARTICLE | doi:10.20944/preprints201905.0098.v2
Subject: Mathematics & Computer Science, Other Keywords: open review; open science; zero-blind review; peer review; methodology
Online: 16 August 2019 (05:27:55 CEST)
We present a discussion and analysis regarding the benefits and limitations of open and non-anonymized peer review based on literature results and responses to a survey on the reviewing process of alt.chi, a more or less open-review track within the CHI conference, the predominant conference in the field of human-computer interaction (HCI). This track currently is the only implementation of an open-peer-review process in the field of HCI while, with the recent increase in interest in open science practices, open review is now being considered and used in other fields. We collected 30 responses from alt.chi authors and reviewers and found that, while the benefits are quite clear and the system is generally well liked by alt.chi participants, they are reluctant to see it used in other venues. This concurs with a number of recent studies that suggest a divergence between support for a more open review process and its practical implementation. The data and scripts are available on https://osf.io/vuw7h/, and the figures and follow-up work on http://tiny.cc/OpenReviews.
ARTICLE | doi:10.20944/preprints202202.0088.v1
Subject: Medicine & Pharmacology, Sport Sciences & Therapy Keywords: Public open spaces; Open streets; Built environment; Leisure-time physical activity; Epidemiology
Online: 7 February 2022 (13:02:09 CET)
Leisure-time physical activity (LTPA) is associated with access to and use of public open spaces. The "President João Goulart Elevated Avenue", currently called "Minhocão", is a facility for leisure activities that is open to people at night and on weekends. The aim of this study was to examine whether the prevalence of LTPA among individuals living in the surroundings of Minhocão differs according to proximity to, and use of, the facility. We conducted a cross-sectional study with cluster sampling of people aged ≥18 years living in households within 500 m of Minhocão or between 501 m and 1500 m from it. The survey was conducted between December 2017 and March 2019 using a self-administered electronic questionnaire. We conducted bivariate analysis and Poisson regression to examine possible differences in LTPA according to proximity of residence and use of Minhocão, using post-stratification weights. A total of 12,030 telephone numbers were drawn (≤500 m: 6,942; >500 m to ≤1500 m: 5,088). The final analyzed sample consisted of 235 residents who returned the questionnaires. The prevalence of individuals engaging in at least 150 minutes per week of LTPA was higher among users than non-users (prevalence ratio = 2.23, 95% CI 1.72-2.90). People who used the park had a higher prevalence of all types of LTPA than non-users. The results can inform government decision-making on the future of Minhocão.
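A prevalence ratio of this kind is the ratio of two prevalences, with a confidence interval usually computed on the log scale; a minimal sketch with invented counts (not the study's data).

```python
import math

def prevalence_ratio(cases_exposed, n_exposed, cases_unexposed, n_unexposed, z=1.96):
    """Prevalence ratio with a Wald-type 95% CI computed on the log scale."""
    p1 = cases_exposed / n_exposed
    p0 = cases_unexposed / n_unexposed
    pr = p1 / p0
    se_log = math.sqrt((1 - p1) / cases_exposed + (1 - p0) / cases_unexposed)
    return pr, pr * math.exp(-z * se_log), pr * math.exp(z * se_log)

# Invented counts: 60 of 100 users vs 27 of 100 non-users meeting 150 min/week.
pr, lo, hi = prevalence_ratio(60, 100, 27, 100)
print(f"PR = {pr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The study's regression additionally adjusts for sampling weights, which this bare ratio does not.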
ARTICLE | doi:10.20944/preprints202001.0240.v1
Subject: Social Sciences, Library & Information Science Keywords: openness under neoliberalism; open-access licensing in capitalism; the politics of open-licensing
Online: 21 January 2020 (11:00:41 CET)
The terms 'open' and 'openness' are widely used across the current higher education environment particularly in the areas of repository services and scholarly communications. Open-access licensing and open-source licensing are two prevalent manifestations of open culture within higher education research environments. As theoretical ideals, open-licensing models aim at openness and academic freedom. But operating as they do within the context of global neoliberalism, to what extent are these models constructed by, sustained by, and co-opted by neoliberalism? In this paper, we interrogate the use of open-licensing within scholarly communications and within the larger societal context of neoliberalism. Through synthesis of various sources, we will examine how open access licensing models have been constrained by neoliberal or otherwise corporate agendas, how open access and open scholarship have been reframed within discourses of compliance, how open-source software models and software are co-opted by politico-economic forces, and how the language of 'openness' is widely misused in higher education and repository services circles to drive agendas that run counter to actually increasing openness. We will finish by suggesting ways to resist this trend and use open-licensing models to resist neoliberal agendas in open scholarship.
ARTICLE | doi:10.20944/preprints201708.0069.v1
Subject: Mathematics & Computer Science, Other Keywords: energy system analysis; model challenges; open science; open source; energy modelling framework; oemof
Online: 21 August 2017 (03:02:34 CEST)
The research field of energy system analysis is dealing with increasingly complex energy systems and their respective challenges. Moreover, the requirement for open science has become a focal point of public interest. Both drivers have triggered the development of a broad range of (open) energy models and frameworks in recent years. However, there are hardly any approaches on how to evaluate these tools in terms of their capabilities to tackle energy system modelling challenges. This paper describes a first step towards a flexible evaluation of software to model energy systems. We propose a qualitative approach as a useful supplement to existing model fact sheets and transparency checklists. We demonstrate its applicability by evaluating the newly developed "Open Energy Modelling Framework" with respect to existing challenges in energy system modelling. The case study results highlight that challenges related to complexity and scientific standards can be tackled to a large extent, while the challenges of model utilization and interdisciplinary modelling are only partially addressed. The challenge of uncertainty remains largely unaddressed at present. Advantages of the evaluation approach lie in its simplicity, flexibility and transferability to other tools; disadvantages mostly stem from its qualitative nature. Our analysis reveals that some challenges in the field of energy system modelling cannot be addressed by software, as they lie on a meta level, such as model result communication and interdisciplinary modelling.
ARTICLE | doi:10.20944/preprints202112.0099.v1
Online: 7 December 2021 (11:30:56 CET)
Forest recreation can be used successfully for psychological relaxation and as a remedy for common stress-related problems. A special form of forest recreation intended for restoration is forest bathing. These activities can be disrupted by factors that interrupt psychological relaxation, such as seeing buildings in the forest or using a computer in nature. One such factor may be encountering an open dump in the forest during an outdoor experience. To test the hypothesis that an open dump might decrease psychological relaxation, a case study was planned using a randomized, controlled crossover design. Two groups of healthy young adults viewed a control forest or a forest with an open dump in reverse order and filled in psychological questionnaires after each stimulus; a pretest was also used. Participants wore opaque eye patches to block visual stimulation before the experimental stimulation, and the physical environment was monitored. The results were analyzed using two-way repeated-measures ANOVA. The measured negative psychological indicators increased significantly after viewing the forest with waste: five indicators of the Profile of Mood States increased (Tension-Anxiety, Depression-Dejection, Anger-Hostility, Fatigue, and Confusion), and the negative aspect of the Positive and Negative Affect Schedule increased in comparison to the control and pretest. The measured positive indicators decreased significantly after viewing the forest with waste: the positive aspect of the Positive and Negative Affect Schedule decreased, and the Restorative Outcome Scale and Subjective Vitality scores decreased in comparison to the control and pretest. The occurrence of an open dump in the forest might thus interrupt a normal restorative experience by reducing psychological relaxation.
Nevertheless, the mechanism underlying these effects is not yet known and will be investigated further. A future study should also examine the size of the impact of open dumps on normal everyday experiences. Different mechanisms might be responsible for these reactions; however, the aim of this manuscript is only to measure the reaction itself. The psychological mechanisms involved can be assessed in further studies.
ARTICLE | doi:10.20944/preprints201901.0238.v1
Online: 23 January 2019 (10:15:00 CET)
In December 2012, DOAJ's parent company, IS4OA, announced that it would introduce new criteria for inclusion in DOAJ, that DOAJ would collect vastly more information from journals as part of the accreditation process, and that journals already included would need to reapply in order to remain in the registry. My hypothesis was that the journals removed from DOAJ on May 9th 2016 would chiefly be journals from small publishers (mostly single-journal publishers) and that DOAJ journal metadata would reveal them to be journals with a lower level of publishing competence than those remaining in DOAJ. Indicators of publishing competence include the use of APCs, permanent article identifiers, journal licenses, article-level metadata deposited with DOAJ, archiving policies/solutions and/or having a policy in SHERPA/RoMEO. The analysis confirms the hypothesis.
CONCEPT PAPER | doi:10.20944/preprints201707.0095.v1
Online: 31 July 2017 (16:05:17 CEST)
Chemistry is the last natural science discipline to embrace prepublishing, namely the publication of non-peer-reviewed scientific articles on the internet. After a brief look at the origins and purpose of prepublishing in science, we analyze the current situation, aiming to answer several questions. Why has the chemistry community been late in embracing prepublishing? Is this related to the same community's slow acceptance of open access publishing? Will prepublishing become a common habit for chemistry scholars as well?
ARTICLE | doi:10.20944/preprints202002.0165.v1
Subject: Social Sciences, Library & Information Science Keywords: open access; api; self-archiving; automation
Online: 13 February 2020 (10:34:30 CET)
This paper describes the design and development of an interoperable application that supports green open access with long-term sustainability and an improved user experience for article deposit. Introduction: The lack of library resources and unfriendly repository user interfaces are two significant barriers that hinder green open access. Tasked with implementing the open access mandate, librarians at an American research university developed a comprehensive system called Easy Deposit 2 to automate the support workflow of green open access. Implementation: Easy Deposit 2 is a web application that can harvest new publications, reach out for manuscripts on behalf of the library, and facilitate self-archiving to the institutional repository (IR). It is developed and maintained by the library and integrated with the IR. Results and Discussion: The article deposit rate with Easy Deposit 2 is about 25%, a significant increase over the previous period. The system also serves as a local database of faculty publications with their open access status. A lesson learned is that a library cannot rely on a single commercial provider for publication data, due to mismatched priorities. Conclusion: Recent IT developments provide new opportunities for innovations like Easy Deposit 2 to support open access. Academic librarians are vital in promoting "openness" in scholarly communication, such as transparency and diversity in the sharing of publication data.
ARTICLE | doi:10.20944/preprints201905.0029.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: multilingual; open information extraction; parallel corpus
Online: 6 May 2019 (06:14:07 CEST)
The number of documents published on the Web in languages other than English grows every year, increasing the need to extract useful information from different languages and underscoring the importance of research on Open Information Extraction (OIE). Most OIE methods deal with features from a single language; few approaches tackle multilingual aspects, and in those, multilinguality is handled only through general extraction rules, which results in low precision. Multilingual methods have been applied to a vast range of problems in Natural Language Processing, achieving satisfactory results and demonstrating that knowledge acquired for one language can be transferred to other languages to improve the quality of the extracted facts. We argue that a multilingual approach can enhance OIE methods and is ideal for evaluating and comparing OIE systems, and consequently for exploiting the collected facts. In this work, we discuss how transferring knowledge between languages can improve acquisition in multilingual approaches. We provide a roadmap of the Multilingual Open IE area with respect to state-of-the-art studies. Additionally, we evaluate the transfer of knowledge to improve the quality of the facts extracted in each language, and we discuss the importance of a parallel corpus for evaluating and comparing multilingual systems.
ARTICLE | doi:10.20944/preprints201809.0017.v1
Online: 3 September 2018 (09:39:24 CEST)
Universities, like cities, have embraced novel technologies and data-based solutions to improve their campuses, with 'smart' becoming a welcome concept. Campuses are in many ways small-scale cities: they increasingly seek to address similar challenges and to deliver improved experiences to their users. How can data be used to make this vision a reality? What can we learn from smart campuses that can be scaled up to smart cities? A short research study was conducted over a three-month period at a public university in the United Kingdom, employing stakeholder interviews and user surveys, to gain insight into these questions. Based on the study, the authors suggest that making data publicly available could bring many benefits to different groups of stakeholders and campus users. These benefits come with risks and challenges, such as data privacy and protection and infrastructure hurdles. However, if these challenges can be overcome, open data could contribute significantly to improving campuses and user experiences, and potentially set an example for smart cities.
ARTICLE | doi:10.20944/preprints201806.0243.v1
Online: 15 June 2018 (05:19:00 CEST)
This paper explores whether preprints can better support open science by providing links to other early-stage research outputs, which could benefit the transparency and discoverability of research projects. By looking at preprint submission systems and online preprints, and surveying those who run preprint servers, I examined to what extent this is currently possible. No preprint server provided a complete service, but many allowed the linking of several open science elements from the abstract page. I also looked at variation by subject, age, and size of preprint server. In conclusion, authors posting preprints should consider the options provided by different preprint servers. It appears that open science is just one focus of preprint servers, and further improvements will depend on preprint server policies and priorities rather than on overcoming any technical difficulties.
CASE REPORT | doi:10.20944/preprints201801.0066.v1
Online: 8 January 2018 (11:11:47 CET)
The implementation of the European Cohesion Policy, which aims to foster regional competitiveness, economic growth and the creation of new jobs, is documented for the period 2014–2020 in the publicly available Open Data Portal for the European Structural and Investment funds. On the basis of this source, this paper describes the process of data mining and visualization for producing information on regional programmes' performance in achieving effective expenditure of resources.
REVIEW | doi:10.20944/preprints202301.0349.v2
Subject: Physical Sciences, General & Theoretical Physics Keywords: Open systems; irreversibility; Second Law of Thermodynamics
Online: 3 February 2023 (02:47:11 CET)
There is great current interest in systems represented by non-Hermitian Hamiltonians, including a wide variety of real systems that may be dissipative and whose behaviour can be represented by a "phase" parameter that characterises the way "exceptional points" (singularities of various sorts) determine the system. These systems are briefly reviewed with an emphasis on their geometrical thermodynamics properties.
REVIEW | doi:10.20944/preprints202301.0126.v1
Subject: Physical Sciences, General & Theoretical Physics Keywords: Open systems; irreversibility; Second Law of Thermodynamics
Online: 6 January 2023 (11:37:13 CET)
There is great current interest in systems represented by non-Hermitian Hamiltonians, including a wide variety of real systems that may be dissipative and whose behaviour can be represented by a "phase" parameter that characterises the way "exceptional points" (singularities of various sorts) determine the system. These systems are briefly reviewed with an emphasis on their geometrical thermodynamics properties.
REVIEW | doi:10.20944/preprints202110.0390.v1
Subject: Life Sciences, Biotechnology Keywords: Biological contaminants; grazers; microalgae; open cultivation; biopesticides
Online: 26 October 2021 (14:36:19 CEST)
Microalgal biomass is an emerging raw material for the production of food, fuel, and other value-added products. However, bulk production of microalgal biomass at commercial scale remains a formidable task for current mass-production technologies because of contamination by undesirable biological pollutants. These contaminants hamper biomass production by weakening the growth of cultures and degrading the quality of the biomass, and may sometimes crash the whole culture. Making the best industrial use of microalgal biomass requires avoiding the various possible biological contaminations in mass cultivation systems, understanding the contamination mechanisms, and understanding the complex interactions of algae with other microorganisms. This review explores the various types of biological pollutants, their possible modes of infection and the underlying mechanisms, and the different control methods for maintaining the desired microalgae culture.
ARTICLE | doi:10.20944/preprints202106.0648.v1
Subject: Social Sciences, Accounting Keywords: open access; article processing charges; monitoring systems
Online: 28 June 2021 (12:33:05 CEST)
The Open Access (OA) publishing model that is based on article processing charges (APC) is often associated with the potential for more transparency regarding the expenditure on publications. However, the extent to which transparency can be achieved depends not least on the completeness of data in APC monitoring systems. This article investigates two blind spots of the largest collection of APC payment information, OpenAPC. It aims to identify likely APC-liable publications for German universities that contribute to this system and for those that do not provide data to it. The calculation combines data from the Web of Science, the ISSN-Gold-OA list and OpenAPC. The results show that, for the group of universities contributing to the monitoring system, more than half of the APC payments are not covered by it, and the average payment for non-covered APCs is higher than for APCs covered by the system. In addition, the group of universities that does not contribute to OpenAPC accounts for two thirds of the number of APC-liable publications recorded for contributing universities. Given the size of these blind spots, the value of the monitoring system is limited at present.
ARTICLE | doi:10.20944/preprints202101.0082.v2
Subject: Earth Sciences, Atmospheric Science Keywords: Shoreline Evolution; Open-Source Software; GIS; Modeling
Online: 19 February 2021 (09:46:48 CET)
This paper presents the validation of the End Point Rate (EPR) tool for QGIS (EPR4Q), a tool built in the QGIS Graphical Modeler to calculate shoreline change by the End Point Rate method. EPR4Q aims to fill the gap left by the absence of a user-friendly, free and open-source tool for shoreline analysis in a Geographic Information System environment: the most widely used software, the Digital Shoreline Analysis System (DSAS), although a free extension, runs on commercial software, while the best free and open-source option for calculating EPR, Analyzing Moving Boundaries Using R (AMBUR), although robust and powerful, involves complexity and heavy processing that can restrict its accessibility and simple usage. The validation methodology consists of applying EPR4Q, DSAS, and AMBUR to different examples of shorelines found in nature, extracted from a U.S. Geological Survey Open-File report, and comparing the results of each tool with the Pearson correlation coefficient. The validation results indicate that the EPR4Q tool achieves high correlation with DSAS and AMBUR, reaching coefficients of 0.98 to 1.00 on linear, extensive, and non-extensive shorelines, ensuring that EPR4Q is ready to be used freely by the academic, scientific, engineering, and coastal-management communities worldwide.
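The End Point Rate statistic that all three tools compute is simply the net shoreline movement along a transect divided by the time elapsed between the oldest and newest shorelines. A minimal sketch of that calculation (the distances and dates below are hypothetical; the actual tools derive the distances from baseline–transect intersections in the GIS):

```python
from datetime import date

def end_point_rate(d_old_m, d_new_m, t_old, t_new):
    """End Point Rate: net shoreline movement (newest minus oldest
    transect-intersection distance from the baseline, in metres)
    divided by the time elapsed between the two shorelines, in years."""
    years = (t_new - t_old).days / 365.25
    return (d_new_m - d_old_m) / years

# A transect where the shoreline moved from 120.0 m to 102.5 m
# (measured from an onshore baseline) between 2000 and 2020:
rate = end_point_rate(120.0, 102.5, date(2000, 1, 1), date(2020, 1, 1))
print(rate)  # -0.875: a negative rate (m/yr) indicates erosion
```

The Pearson correlations reported in the paper compare arrays of such per-transect rates produced by the different tools.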
ARTICLE | doi:10.20944/preprints202003.0443.v2
Subject: Social Sciences, Library & Information Science Keywords: COVID-19; open science; data; bibliometric; pandemic
Online: 22 April 2020 (06:15:34 CEST)
Introduction: The pandemic of COVID-19, an infectious disease caused by SARS-CoV-2, motivated the scientific community to work together to gather, organize, process and distribute data on the novel biomedical hazard. Here, we analyzed how the scientific community responded to this challenge by quantifying the distribution and availability patterns of academic information related to COVID-19. The aim of our study was to assess the quality of the information flow and of scientific collaboration, two factors we believe to be critical for finding new solutions to the ongoing pandemic. Materials and methods: The RISmed R package and a custom Python script were used to fetch metadata on articles indexed in PubMed and published on the Rxiv preprint server. Scopus was searched manually and the metadata exported as a BibTeX file. Publication rate and publication status, affiliation and author count per article, and submission-to-publication time were analysed in R. The Biblioshiny application was used to create a world collaboration map. Results: Our preliminary data suggest that the COVID-19 pandemic resulted in the generation of a large amount of scientific data and demonstrate potential problems regarding information velocity, availability, and scientific collaboration in the early stages of the pandemic. More specifically, our results indicate a precarious overload of the standard publication systems, significant problems with data availability and apparently deficient collaboration. Conclusion: In conclusion, we believe the scientific community could have used the data more efficiently to create proper foundations for finding new solutions to the COVID-19 pandemic. Moreover, we believe we can learn from this as we go and adopt open science principles and a more mindful approach to COVID-19-related data to accelerate the discovery of more efficient solutions.
We take this opportunity to invite our colleagues to contribute to this global scientific collaboration by publishing their findings with maximal transparency.
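One of the metrics the study derives from article metadata, the submission-to-publication time, reduces to date arithmetic over (submitted, published) pairs. A minimal sketch with hypothetical records (the study extracts the real dates from PubMed and Scopus metadata):

```python
from datetime import date
from statistics import median

def submission_to_publication_days(records):
    """Median submission-to-publication lag in days, from
    (submitted, published) date pairs extracted from article metadata."""
    return median((pub - sub).days for sub, pub in records)

# Hypothetical article records:
records = [
    (date(2020, 2, 1), date(2020, 2, 15)),   # 14-day lag
    (date(2020, 2, 3), date(2020, 3, 10)),   # 36-day lag
    (date(2020, 2, 10), date(2020, 4, 1)),   # 51-day lag
]
print(submission_to_publication_days(records))  # 36
```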
ARTICLE | doi:10.20944/preprints201912.0312.v1
Subject: Engineering, Civil Engineering Keywords: thermal comfort; draught; cooling period; open office
Online: 24 December 2019 (08:42:03 CET)
Local thermal comfort (TC) and draught rate (DR) have been studied widely. More meaningful research has been performed under controlled boundary conditions than in actual work environments involving occupants, and TC conditions in office buildings in Estonia have barely been investigated in the past. In this paper, the results of TC and DR assessments in five office buildings in Tallinn are presented and discussed. The studied office landscapes vary in heating, ventilation and cooling (HVAC) system parameters, room units and elements. All sample buildings were less than six years old and equipped with a dedicated outdoor-air ventilation system and room conditioning units. The on-site measurements consisted of TC and DR assessment together with an indoor climate questionnaire (ICQ). The purpose of the survey is to assess the correspondence between the HVAC design and the actual situation. The results show whether, and to what extent, the standard-based criteria for TC suit the occupants' actual use of the buildings. Preferring one type of room conditioning unit or system may not guarantee a better thermal environment without draught. Although some HVAC systems observed in this study should create the prerequisites for greater comfort, the results show that this is not the case for all buildings in this study.
ARTICLE | doi:10.20944/preprints201808.0326.v2
Subject: Social Sciences, Library & Information Science Keywords: business plan; publishing; academic libraries; open access
Online: 27 September 2018 (04:27:08 CEST)
Over the last twenty years, library publishing has emerged in higher education as a new class of publisher. Conceived as a response to commercial publishing practices that have strained library budgets and prevented scholars from openly licensing and sharing their works, library publishing is both a local service program and a broader movement to disrupt the current scholarly publishing arena. It is growing both in numbers of publishers and numbers of works produced. The commercial publishing framework which determines the viability of monetizing a product is not necessarily applicable for library publishers who exist as a common good to address the needs of their academic communities. Like any business venture, however, library publishers must develop a clear service model and business plan in order to create shared expectations for funding streams, quality markers, as well as technical and staff capacity. As the field is maturing from experimental projects to full programs, library publishers are formalizing their offerings and limitations. The anatomy of a library publishing business plan is presented and includes the principles of the program, scope of services, and staffing and governance requirements. Other aspects include production policies, financial structures, and measures of success.
ARTICLE | doi:10.20944/preprints201808.0492.v1
Subject: Social Sciences, Library & Information Science Keywords: bibliometrics; publication statistics; open Access; citation impact
Online: 29 August 2018 (10:32:21 CEST)
Based on the total scholarly article output of Norway, we investigated the coverage and degree of openness according to three bibliographic services: 1) Google Scholar, 2) oaDOI by Impactstory and 3) 1findr by 1science. According to Google Scholar, we find that more than 70% of all Norwegian articles are openly available. However, the degrees are profoundly lower according to oaDOI and 1findr, at 31% and 52% respectively. The varying degrees are mainly caused by different interpretations of openness, with oaDOI being the most restrictive. Furthermore, open shares vary considerably by discipline, with Medicine and Health Sciences at the upper end and the Humanities at the lower end. We also determined citation frequencies using Cited-by values from Google Scholar, applying year and subject normalization. We find a significant citation advantage for open articles. However, this is not the case for all types of openness; in fact, the category of open access journals was by far the lowest cited, indicating that young journals with a declared open access policy still lack recognition.
ARTICLE | doi:10.20944/preprints201803.0067.v1
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: open microcontrolled platform; data acquisition; remote measurement
Online: 8 March 2018 (15:21:13 CET)
Commercial equipment for measuring temperature has a high cost. This article therefore describes the development of temperature-measurement equipment based on a microcontrolled platform, which manages the data from the collected temperature signals and makes the acquired information available so that it can be checked in real time at the measurement site or remotely. The equipment was built using open-platform hardware and software, and performance tests were carried out with the objective of obtaining temperature-measurement equipment that combines measurement quality with low cost.
ARTICLE | doi:10.20944/preprints201705.0045.v1
Online: 5 May 2017 (05:29:10 CEST)
Dump design and scheduling are critical elements of effective mine planning, especially when several dumps are required in large-scale open pit mines. Infrastructure capital and transportation costs are considerable from an early stage in the mining project and throughout the life-of-mine as these dumps gradually become immense structures. Delivered mining rates, together with certain spatial and physical constraints, provide a set of parameters with mathematical and economic relationships that create opportunities for modelling, and thus facilitate measuring and optimizing the ultimate dump design using programming and empirical techniques while achieving economic objectives. This paper presents a methodology to model and optimize the design of a mine dump by minimizing the total haulage costs. The proposed methodology consists of: (i) formulating a dump model based on a system of equations relying on multiple relevant parameters; (ii) solving it by minimizing the total cost using linear programming, which determines a 'preliminary' dump design; and (iii) through a series of iterations, modifying the 'preliminary' footprint by projecting it onto the topography to create the ultimate dump design. Finally, an example application to a waste rock dump illustrates this methodology.
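The cost-minimization step can be illustrated on a deliberately simplified, single-source version of the problem, where a greedy allocation by unit haulage cost is optimal; the paper's full formulation uses linear programming over a richer system of equations. All site names, capacities and unit costs below are hypothetical:

```python
def allocate_waste(total_tonnes, dumps):
    """Assign waste tonnage to candidate dump sites in order of
    increasing unit haulage cost ($/t, proportional to haul distance),
    respecting each site's capacity. For this single-source, linear-cost
    toy case the greedy allocation minimizes total haulage cost."""
    plan, remaining, cost = {}, total_tonnes, 0.0
    for name, capacity, unit_cost in sorted(dumps, key=lambda d: d[2]):
        tonnes = min(remaining, capacity)
        if tonnes > 0:
            plan[name] = tonnes
            cost += tonnes * unit_cost
            remaining -= tonnes
    if remaining > 0:
        raise ValueError("insufficient dump capacity")
    return plan, cost

# Hypothetical candidate sites: (name, capacity in t, haulage cost in $/t)
sites = [("north", 60_000, 1.8), ("east", 50_000, 1.2), ("south", 40_000, 2.5)]
plan, cost = allocate_waste(100_000, sites)
print(plan, cost)  # cheapest sites are filled first
```

With multiple sources, period-by-period scheduling and spatial constraints, the greedy rule no longer suffices, which is why an LP formulation is used in the paper.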
ARTICLE | doi:10.20944/preprints201906.0294.v1
Subject: Mathematics & Computer Science, Geometry & Topology Keywords: is*-open set; is*-continuous; is*-open; is*-irresolute; is*-totally continuous; is-contra-continuous mappings; is*-separation
Online: 28 June 2019 (11:45:03 CEST)
In this paper, we introduce a new class of open sets called is*-open sets. We also present the notions of is*-continuous, is*-open, is*-irresolute, is*-totally continuous, and is-contra-continuous mappings, and we investigate some properties of these mappings. Furthermore, we introduce some is*-separation axioms and relate the is*-mappings to them.
REVIEW | doi:10.20944/preprints202203.0217.v1
Subject: Engineering, Energy & Fuel Technology Keywords: energy policy; energy conservation; climate change; global safety; open hardware; open source; photovoltaic; renewable energy; solar energy; national security
Online: 15 March 2022 (14:27:35 CET)
Free and open source hardware (FOSH) development has been shown to increase innovation and reduce economic costs. This article reviews the opportunity to use FOSH as a sanction to undercut imports and exports from a target criminal country. A formal methodology is presented for selecting strategic national investments in FOSH development to improve both national security and global safety. In this methodology, first the target country that is threatening national security or safety is identified. Next, the top imports from the target country, as well as those of potentially other importing countries (allies), are quantified. Hardware is then identified that could undercut imports/exports from the target country. Finally, methods to support FOSH development are enumerated to support production in a commons-based peer production strategy. To demonstrate how this theoretical method works in practice, it is applied as a case study to the current criminal military aggressor nation, which is also a fossil fuel exporter. The results show there are numerous existing FOSH, and opportunities to develop new FOSH, for energy conservation and renewable energy to reduce fossil fuel energy demand. Widespread deployment would reduce the concomitant pollution, human health impacts, and environmental desecration, as well as cut financing of military operations.
ARTICLE | doi:10.20944/preprints202301.0228.v1
Subject: Earth Sciences, Geoinformatics Keywords: open source; climate indices; emissions scenarios; climate projections
Online: 12 January 2023 (13:49:09 CET)
The paper presents the open source tool climdex-kit, which includes utilities to compute, analyze and visualize climate indices based on the input data, target domain and temporal extent defined by the user. It is intended to support researchers, practitioners and decision makers in deriving, handling and interpreting meaningful information for climate change studies and sectoral applications. It currently includes the computation of 30 indices based on temperature and precipitation, describing both mean and extreme climate conditions, and it is designed to work with climate model projections. The tool is written in Python and integrates utilities from the well-established Climate Data Operators (CDO) and NetCDF Operators (NCO) libraries. The specific utilities for selecting, aggregating and visualizing data are intended to help users produce tailored results that improve the understanding and communication of future climate change. To show the functionalities of the package as well as its potential integration in regional climate services, climdex-kit was applied to an ensemble of downscaled climate model projections up to 2100 for the region Trentino – South Tyrol (north-eastern Italian Alps). The projections for a selection of indices accounting for extreme temperature and precipitation conditions were derived and different visualization choices discussed. The package climdex-kit is developed in a way that allows users to implement additional routines for calculating other indices, as well as to easily adapt the routines to handle different data types and spatio-temporal targets defined by the specific application. Effective climate services can in fact be developed only if flexible tools and customizable climate information are integrated with a clear understanding of data features and limitations.
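Temperature-based indices of the kind the tool computes are typically threshold counts over daily series. A minimal sketch of one standard index, frost days, over a plain list of values (climdex-kit itself operates on gridded NetCDF projections via CDO/NCO; the data below are hypothetical):

```python
def frost_days(tmin_series_degc):
    """Frost days (FD): the count of days whose daily minimum
    temperature falls below 0 degC -- one of the standard
    temperature-based climate indices."""
    return sum(1 for t in tmin_series_degc if t < 0.0)

# One hypothetical week of daily minimum temperatures (degC):
week = [-3.2, -0.5, 0.0, 1.4, -1.1, 2.8, -0.1]
print(frost_days(week))  # 4
```

Applied per year and per grid cell, such counts yield the index maps and time series the tool visualizes.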
ARTICLE | doi:10.20944/preprints202212.0018.v1
Subject: Engineering, Control & Systems Engineering Keywords: airborne wind energy; optimal control; open-source software
Online: 1 December 2022 (08:54:28 CET)
In this paper we present AWEbox, a Python toolbox for modeling and optimal control of multi-aircraft systems for airborne wind energy (AWE). AWEbox provides an implementation of optimization-friendly multi-aircraft AWE dynamics for a wide range of system architectures and modeling options. It automatically formulates typical AWE optimal control problems based on these models, and finds a numerical solution in a reliable and efficient fashion. To obtain a high level of reliability and efficiency, the toolbox implements different homotopy methods for initial guess refinement. The first type of method produces a feasible initial guess from an analytic initial guess based on user-provided parameters. The second type implements a warm-start procedure for parametric sweeps. We investigate the software performance in two different case studies. In the first case study we solve a single-aircraft reference problem for a large number of different initial guesses. The homotopy methods reduce the expected computation time by a factor of 1.7 and the peak computation time by a factor of 8, compared to when no homotopy is applied. Overall, the CPU timings are competitive with timings reported in the literature. When the user initialization draws on expert a priori knowledge, homotopies do not increase expected performance, but the peak CPU time is still reduced by a factor of 5.5. In the second case study, a power curve for a dual-aircraft lift-mode AWE system is computed using the two different homotopy types for initial guess refinement. On average, the second homotopy type, which is tailored for parametric sweeps, outperforms the first type in terms of CPU time by a factor of 3. In conclusion, AWEbox provides an open-source implementation of efficient and reliable optimal control methods that both control experts and non-expert AWE developers can benefit from.
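The warm-start idea behind the sweep-oriented homotopy can be shown on a scalar toy problem: when solving a family of root-finding problems over a swept parameter, seeding each solve with the previous solution keeps Newton iterations cheap and reliable. This is only an illustration of the principle, not AWEbox's actual (optimal-control) formulation; the function and values are hypothetical:

```python
def newton(f, df, x0, tol=1e-10, max_iter=50):
    """Plain Newton's method for a scalar root-finding problem."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Parametric sweep: solve x**3 + x = c for increasing c, warm-starting
# each solve from the previous root instead of a cold initial guess.
solutions, x = [], 0.0
for c in [0.5, 1.0, 1.5, 2.0]:
    x = newton(lambda x, c=c: x**3 + x - c, lambda x: 3 * x**2 + 1, x)
    solutions.append(round(x, 6))
print(solutions)  # the last root, for c = 2, is exactly 1.0
```

In the optimal-control setting, the "previous root" is the previous sweep point's full trajectory and multiplier set, which is what makes the second homotopy type effective for power-curve computations.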
REVIEW | doi:10.20944/preprints202105.0352.v1
Subject: Life Sciences, Biochemistry Keywords: 3d printing; microscopy; open-source; optics; super-resolution
Online: 14 May 2021 (16:10:24 CEST)
The maker movement has reached the optics labs, empowering researchers to actively create and modify microscope designs and imaging accessories. 3D printing has had an especially disruptive impact on the field, as it entails an accessible new approach to fabrication technologies, namely additive manufacturing, making prototyping in the lab available at low cost. Examples of this trend take advantage of the ready availability of 3D printing technology: inexpensive microscopes for education have been designed, such as the FlyPi, while the highly complex robotic microscope OpenFlexure represents a clear desire for the democratisation of this technology. 3D printing facilitates new and powerful approaches to science and promotes collaboration between researchers, as 3D designs are easily shared. This offers the unique possibility of extending the open-access concept from knowledge to technology, allowing researchers everywhere to use and extend model structures. Here we present a review of additive manufacturing applications in microscopy, guiding the user through this new and exciting technology and providing a starting point for anyone willing to employ this versatile and powerful new tool.
ARTICLE | doi:10.20944/preprints202102.0513.v1
Subject: Earth Sciences, Atmospheric Science Keywords: Sea-Level Rise; GIS; Open-Source Software; Modeling
Online: 23 February 2021 (12:39:09 CET)
Sea-level rise is a problem increasingly affecting coastal areas worldwide. The existence of free and open-source models to estimate the sea-level impact can contribute to better coastal management. This study aims to develop and validate two different models to predict the sea-level rise impact, supported by Google Earth Engine (GEE), a cloud-based platform for planetary-scale environmental data analysis. The first model is a Bathtub Model based on the uncertainty of projections of the Sea-level Rise Impact Module of the TerrSet Geospatial Monitoring and Modeling System software. The validation process performed in the Rio Grande do Sul coastal plain (S Brazil) resulted in correlations from 0.75 to 1.00. The second model uses the Bruun Rule formula implemented in GEE and can determine the coastline retreat of a profile through the creation of a simple vector line from topo-bathymetric data. The model shows a very high correlation (0.97) with a classical Bruun Rule study performed on the Aveiro coast (NW Portugal). The GEE platform seems to be an important tool for coastal management. The models developed have been openly shared, enabling the continuous improvement of the code by the scientific community.
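The classical Bruun Rule that the second model implements relates shoreline retreat linearly to sea-level rise through the geometry of the active beach profile. A minimal sketch of the textbook formula (the profile values below are hypothetical; the paper's GEE model derives them from a topo-bathymetric vector line):

```python
def bruun_retreat(sea_level_rise_m, profile_length_m,
                  berm_height_m, closure_depth_m):
    """Classical Bruun Rule: shoreline retreat R = S * L / (B + h),
    where S is the sea-level rise, L the cross-shore length of the
    active profile, B the berm/dune height and h the closure depth
    (all in metres)."""
    return sea_level_rise_m * profile_length_m / (berm_height_m + closure_depth_m)

# Hypothetical profile: 0.5 m of sea-level rise over a 500 m active
# profile with a 2 m berm and an 8 m closure depth:
print(bruun_retreat(0.5, 500.0, 2.0, 8.0))  # 25.0 m of retreat
```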
ARTICLE | doi:10.20944/preprints202102.0421.v1
Subject: Earth Sciences, Atmospheric Science Keywords: Sea-Level Rise; GIS; Open-Source Software; Modeling
Online: 18 February 2021 (13:52:49 CET)
Sea-level rise is a problem increasingly affecting coastal areas worldwide. The existence of Free and Open-Source Models to estimate the sea-level impact can contribute to better coastal management. This study aims to develop and to validate two different models to predict the sea-level rise impact supported by Google Earth Engine (GEE) – a cloud-based platform for planetary-scale environmental data analysis. The first model is a Bathtub Model based on the uncertainty of projections of the Sea-level Rise Impact Module of TerrSet - Geospatial Monitoring and Modeling System software. The validation process performed in the Rio Grande do Sul coastal plain (S Brazil) resulted in correlations from 0.75 to 1.00. The second model uses Bruun Rule formula implemented in GEE and is capable to determine the coastline retreat of a profile through the creation of a simple vector line from topo-bathymetric data. The model shows a very high correlation (0.97) with a classical Bruun Rule study performed in Aveiro coast (NW Portugal). The GEE platform seems to be an important tool for coastal management. The models developed have been openly shared, enabling the continuous improvement of the code by the scientific community.
ARTICLE | doi:10.20944/preprints202012.0016.v1
Subject: Physical Sciences, Acoustics Keywords: Open quantum systems; Tensor networks; non-equilibrium dynamics
Online: 1 December 2020 (12:20:25 CET)
Simulating the non-perturbative and non-Markovian dynamics of open quantum systems is a very challenging many body problem, due to the need to evolve both the system and its environments on an equal footing. Tensor networks and matrix product states (MPS) have emerged as powerful tools for open system models, but the numerical resources required to treat finite temperature environments grow extremely rapidly and limit their applications. In this study we use time-dependent variational evolution of MPS to explore the striking theory of Tamascelli et al. that shows how finite-temperature open dynamics can be obtained from zero temperature, i.e. pure wave function, simulations. Using this approach, we produce a benchmark data set for the dynamics of the Ohmic spin-boson model across a wide range of couplings and temperatures, and also present a detailed analysis of the numerical costs of simulating non-equilibrium steady states, such as those emerging from the non-perturbative coupling of a qubit to baths at different temperatures. Despite ever growing resource requirements, we find that converged non-perturbative results can be obtained, and we discuss a number of recent ideas and numerical techniques that should allow wide application of MPS to complex open quantum systems.
ARTICLE | doi:10.20944/preprints202009.0350.v1
Subject: Medicine & Pharmacology, Sport Sciences & Therapy Keywords: anterior cruciate ligament; open kinetic chain; laxity; isokinetic
Online: 16 September 2020 (05:55:45 CEST)
Rehabilitation following anterior cruciate ligament reconstruction with a hamstring graft allows patients to regain their functional capacities and supports them in the resumption of sports activities. Rehabilitation also aims to minimize the risk of recurrence, which is why it ensures that the patient's muscular capacities develop properly until the return to sport. Isokinetics helps to strengthen and assess the strength of the thigh muscle groups, but controversy exists over its use, as resisted open-kinetic-chain knee extension could cause the graft to distend. The objective of this study is to determine the influence of isokinetic muscle strengthening on possible laxity of the anterior cruciate ligament and to identify risk factors. The study concerns a population that underwent anterior cruciate ligament reconstruction with a hamstring graft, examined 3 to 6 months after surgery. Two groups are distinguished: one group exposed to isokinetics during rehabilitation, and an unexposed group that underwent rehabilitation without the use of isokinetics. An anterior knee laxity test was performed 6 months postoperatively using the GNRB® machine for all subjects according to the same protocol. The test results were statistically analyzed to determine a relative risk of graft distension for each group in the study. Comparison of the results of each group by univariate analysis did not reveal any significant result. Multivariate analysis showed interactions in the two strata of the study. The use of isokinetics appears to have no effect on the risk of developing distension for the majority of subjects in the exposed group. A tendency towards graft protection was perceived for each variable except age under 25 years (RRa = 1.07). 
The use of isokinetics does not appear to cause graft distension in patients undergoing anterior cruciate ligament reconstruction when this method is introduced 3 months postoperatively.
ARTICLE | doi:10.20944/preprints202008.0081.v1
Subject: Arts & Humanities, Architecture And Design Keywords: air pollution; particulate; PM2.5; open market; pedestrian traffic
Online: 4 August 2020 (08:20:44 CEST)
Market air quality is very important to people's economic lives, yet it is rarely researched; market activities, particularly pedestrian traffic, release particulates that are detrimental to the health of users and stakeholders. A Thermo Scientific MIE pDR-1500 particulate monitor was used to monitor the air quality within the market for eight (8) weeks; the air pollutant of concern is PM2.5. Ten (10) sample points covering pedestrian traffic were located in the market to represent the entire market environment spectrum. The analysis of PM2.5 measured daily during the dry and wet seasons shows a clear seasonal variation of this pollutant, with higher concentrations measured during the dry season than the wet season. The assessment of PM2.5 concentrations shows exceedances of the WHO and NAAQS standards during the dry season, ranging from 47.9 μg/m3 to 231.88 μg/m3 in the morning and from 65.17 μg/m3 to 1806.33 μg/m3 in the afternoon. The findings indicate that pedestrian traffic contributes immensely to air pollution in an open market and that, at such elevated concentrations, prolonged exposure is highly detrimental to health. This study raises awareness of air pollution among pedestrians in open markets and informs policy changes.
ARTICLE | doi:10.20944/preprints202007.0537.v1
Online: 23 July 2020 (08:10:38 CEST)
Energy use is of crucial importance for the global challenge of climate change but also an essential part of daily life. Hence, research on energy needs to be robust and valid. Other scientific disciplines have experienced a reproducibility crisis, that is, existing findings could not be reproduced in new studies, and energy research might be impacted as well. In this paper, we suggest the ‘TReQ’ approach to improve the research practices in the energy field and arrive at greater Transparency, Reproducibility, and Quality. We acknowledge the specific challenges of energy research and suggest a highly adaptable suite of tools that can be applied to research approaches across this multi-disciplinary and fast-changing field. In particular, we introduce preregistration of studies, making data and code publicly available, using preprints, and employing reporting guidelines to heighten the standard of research practices within the energy field. We argue that through wider adoption of these tools, we will be able to have greater trust in the findings of research used to inform evidence-based policy and practice in the energy field.
Subject: Social Sciences, Library & Information Science Keywords: scientific publishing; scientific journals; scholarly publishing; scientific papers; open science; scientific articles
Online: 20 August 2020 (09:48:21 CEST)
In the digital era, in which over 4 billion people regularly access the internet, the conventional process of publishing scientific articles in academic journals following peer review is undergoing profound changes. Following the physics and mathematics scholars who started to publish their work on the freely accessible arXiv server in the early 1990s, researchers of all disciplines increasingly publish scientific articles in the form of freely accessible and fully citeable preprints before or in parallel to conventional submission to academic journals for peer review. The full transition to open science, I argue in this study, requires expanding the education of students and young researchers to include scholarly communication in the digital era.
REVIEW | doi:10.20944/preprints202003.0362.v1
Online: 24 March 2020 (14:46:29 CET)
With the current rapid spread of COVID-19, global health systems are increasingly overburdened by the sheer number of people that need diagnosis, isolation and treatment. Shortcomings are evident across the board, from staffing, facilities for rapid and reliable testing to availability of hospital beds and key medical-grade equipment. The scale and breadth of the problem calls for an equally substantive response not only from frontline workers such as medical staff and scientists, but from skilled members of the public who have the time, facilities and knowledge to meaningfully contribute to a consolidated global response. Here, we summarise community-driven approaches based on Free and Open Source scientific and medical Hardware (FOSH) currently being developed and deployed to bolster access to personal protective equipment (PPE), patient treatment and diagnostics.
ARTICLE | doi:10.20944/preprints201908.0070.v1
Subject: Social Sciences, Geography Keywords: city; large urban regions; Russia; globalization; open database
Online: 6 August 2019 (08:33:24 CEST)
This study explores how to delineate Russian cities in order to make them comparable on the world scale. In doing so, we introduce the concept of large urban regions (LUR) applicable to the Russian urban context. This research is motivated by a principal research question: how to construct a statistical urban delineation that allows us, first, to demonstrate the integration of cities into globalization and, second, to conduct global comparative urban research. Previous studies on urban delineation in Russia have focused almost exclusively on functional urban areas, which have substantial limitations and are not suitable for global urban comparisons. Addressing this research gap, we propose a new definition of Large Urban Regions (LUR). In doing so, we first introduce the context of Russian cities (2), then discuss existing Russian urban concepts (3) and justify the need for a new urban delineation (4). Afterwards, we present a general method to delineate Large Urban Regions in the Russian context (5.1) and illustrate it in two case studies: St. Petersburg (a polycentric region) and Samara (a monocentric region) (5.2). In the last part (6), we discuss the 10 largest urban regions in Russia and describe the constructed database, which includes all Russian LURs.
ARTICLE | doi:10.20944/preprints201810.0245.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: R; Open data; API; Statistics; DSS; Web service.
Online: 11 October 2018 (17:17:15 CEST)
We present a methodology that enables users to interact with statistical information owned by an institution and stored in a cloud infrastructure. Mainly based on R, this approach was developed following the open-data philosophy, and, since we use R, the implementation relies chiefly on open-source software. R offers several advantages from the point of view of data management and acquisition, as it becomes a common framework that can be used to structure the processes involved in any statistical operation. This simplifies access to the data and makes the full power of R available for the information in the cloud. The methodology was applied successfully to develop a tool to manage the data of the Centre d’Estudis d’Opinió, but it can be applied by other institutions to enable open access to their data. The system was also deployed to a cloud infrastructure to ensure scalability and 24/7 access.
ARTICLE | doi:10.20944/preprints201809.0543.v1
Subject: Medicine & Pharmacology, Behavioral Neuroscience Keywords: Open Science, Data Sharing, Neuroimaging, Reproducibility, Transparency, Reform
Online: 27 September 2018 (11:50:12 CEST)
Ongoing debates regarding the virtues and challenges of implementing open science for brain imaging research mirror those of the larger scientific community. The present commentary acknowledges the merits of arguments on both sides, as well as the underlying realities that have forced so many to feel the need to resist the implementation of an ideal. Potential sources of top-down reform are discussed, along with the factors that threaten to slow their progress. The potential roles of generational change and the individual are discussed, and a starter list of actionable steps that any researcher can take, big or small, is provided.
ARTICLE | doi:10.20944/preprints201805.0470.v1
Subject: Earth Sciences, Environmental Sciences Keywords: remote sensing; python; data management; landsat; open-source
Online: 31 May 2018 (11:12:27 CEST)
Many remote sensing analytical data products are most useful when they are in an appropriate regional or national projection, rather than globally based projections like Universal Transverse Mercator (UTM) or geographic coordinates, i.e., latitude and longitude. Furthermore, leaving data in the global systems can create problems, either through misprojection of imagery at UTM zone boundaries or because such projections are not optimised for local use. We developed the open-source Irish Earth Observation (IEO) Python module to maintain a local remote sensing data library for Ireland. This pure Python module, in conjunction with the IEOtools Python scripts, utilises the Geospatial Data Abstraction Library (GDAL) for its geoprocessing functionality. At present, the module supports only Landsat TM/ETM+/OLI/TIRS data that have been corrected to surface reflectance using the USGS/ESPA LEDAPS/LaSRC Collection 1 architecture. Together, this module and the IEOtools catalogue the available Landsat data from the USGS/EROS archive, and include functions for importing imagery into a defined local projection and for calculating cloud-free vegetation indices. While this module is distributed with default values and data for Ireland, it can be adapted for other regions with simple modifications to the configuration files and geospatial data sets.
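The zone-boundary problem described above can be made concrete with the standard formula that maps longitude to a UTM zone number; the sketch below uses plain Python (the function name is ours, and the special grid exceptions around Norway and Svalbard are deliberately ignored):

```python
def utm_zone(lon: float) -> int:
    """Standard UTM zone number (1-60) for a longitude in decimal degrees.

    Zones are 6 degrees wide, numbered eastwards from the 180th meridian.
    Polar and Scandinavian grid exceptions are ignored in this sketch.
    """
    return int((lon + 180.0) // 6) + 1

# Ireland spans roughly 11W to 5W, straddling the zone 29/30 boundary at 6W,
# so national imagery kept in UTM ends up split across two zones.
print(utm_zone(-10.5))  # west coast of Ireland -> 29
print(utm_zone(-5.5))   # east coast of Ireland -> 30
```

This is exactly why a single, locally optimised national projection avoids the stitching and misprojection issues that arise at zone boundaries.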
TECHNICAL NOTE | doi:10.20944/preprints201804.0047.v1
Online: 4 April 2018 (06:00:40 CEST)
The exceptional increase in molecular DNA sequence data in open repositories is mirrored by an ever-growing interest among evolutionary biologists in harvesting and using those data for phylogenetic inference. Many quality issues, however, are known, and the sheer amount and complexity of the data available can pose considerable barriers to their usefulness. A key issue in this domain is the high frequency of sequence mislabelling encountered when searching for suitable sequences for phylogenetic analysis. These issues include the incorrect identification of sequenced species, non-standardised and ambiguous sequence annotation, and the inadvertent addition of paralogous sequences by users, among others. Taken together, these issues likely add considerable noise, error or bias to phylogenetic inference, a risk that is likely to increase with the size of the phylogenies or the molecular datasets used to generate them. Here we present a software package, phylotaR, that bypasses the above issues by instead using an alignment search tool to identify orthologous sequences. Our package builds on the framework of its predecessor, PhyLoTa, by providing a modular pipeline for identifying overlapping sequence clusters using up-to-date GenBank data and by providing new features, improvements and tools. We demonstrate our pipeline’s effectiveness by presenting trees generated from phylotaR clusters for two large taxonomic clades: palms and primates. Given the versatility of this package, we hope that it will become a standard tool for any research aiming to use GenBank data for phylogenetic analysis.
ARTICLE | doi:10.20944/preprints201711.0181.v3
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: 3D printing; open source; RepRap; calibration; bed levelling
Online: 12 January 2018 (07:35:31 CET)
Inexpensive piezoelectric diaphragms can be used as sensors to facilitate both nozzle height setting and build platform leveling in FFF (Fused Filament Fabrication) 3D printers. Tests simulating nozzle contact are conducted to establish the available output, and an output of greater than 8 V is found at 20 °C, a value readily detectable by simple electronic circuits. Tests are also conducted at a temperature of 80 °C and, despite a reduction of greater than 80% in output voltage, the signal is still detectable. The reliability of piezoelectric diaphragms is investigated by mechanically stressing samples over 100,000 cycles at both 20 °C and 80 °C, and little loss of output over the test duration is found. The development of a nozzle contact sensor using a single piezoelectric diaphragm is described.
ARTICLE | doi:10.20944/preprints201702.0055.v1
Online: 15 February 2017 (11:20:31 CET)
The process of modelling energy systems is accompanied by challenges inherently connected with mathematical modelling. Under the realities of the 21st century, however, existing challenges are gaining in magnitude and are supplemented by new ones. Modellers are confronted with rising complexity of energy systems and high uncertainties at different levels. In addition, interdisciplinary modelling is necessary for gaining insight into the mechanisms of an integrated world. At the same time, models need to meet scientific standards, as public acceptance becomes increasingly important. In this intricate environment, model application as well as result communication and interpretation are also becoming more difficult. In this paper we present the open energy modelling framework (oemof) as a novel approach to energy system modelling and derive its contribution to existing challenges. To this end, based on a literature review, we outline challenges for energy system modelling as well as existing and emerging approaches. Building on a description of the philosophy and elementary structural elements of oemof, a qualitative analysis of the framework with regard to these challenges is undertaken. Inherent features of oemof, such as its open source, open data, non-proprietary and collaborative modelling approach, are preconditions for meeting the modern realities of energy modelling. Additionally, a generic basis with an object-oriented implementation makes it possible to tackle challenges related to the complexity of highly integrated future energy systems and lays the foundation for addressing uncertainty in the future. Experience from the collaborative modelling approach can enrich interdisciplinary modelling activities. Our analysis concludes that some remaining challenges can be tackled neither by a model nor by a modelling framework; among these are problems connected to result communication and interpretation.
ARTICLE | doi:10.20944/preprints202012.0612.v1
Subject: Social Sciences, Accounting Keywords: Academic journals; Growth of knowledge; Non-peer review; Open access; Open science; Paradigm change; Peer review; Scholarly communication; Science communication; Simplicity
Online: 24 December 2020 (09:31:35 CET)
This article challenges the assumption that journals and peer review are essential for developing, evaluating and disseminating scientific and other academic knowledge. It suggests a more flexible ecosystem, and examines some of the possibilities this might facilitate. The market for academic outputs should be opened up by encouraging the separation of the dissemination service from the evaluation service. Publishing research in subject-specific journals encourages compartmentalizing research into rigid categories. The dissemination of knowledge would be better served by an open access, web-based repository system encompassing all disciplines. Reviews from peers should be supplemented by reviews from non-peers from a variety of different perspectives: user reviews, statistical reviews, reviews from the perspective of different disciplines, and so on. This should reduce the inevitably conservative influence of relying on two or three peers, and make the evaluation system more critical, multi-dimensional and responsive to the requirements of different audience groups, changing circumstances, and new ideas. Non-peer review might make it easier to challenge dominant paradigms, and expanding the potential audience beyond a narrow group of peers might encourage the criterion of simplicity to be taken more seriously - which is essential if knowledge is to continue to progress.
ARTICLE | doi:10.20944/preprints202003.0073.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: digital object; data infrastructure; research infrastructure; data management; data science; FAIR data; open science; European Open Science Cloud; EOSC; persistent identifier
Online: 5 March 2020 (02:30:06 CET)
Data science is facing the following major challenges: (1) developing scalable cross-disciplinary capabilities, (2) dealing with the increasing data volumes and their inherent complexity, (3) building tools that help to build trust, (4) creating mechanisms to efficiently operate in the domain of scientific assertions, (5) turning data into actionable knowledge units and (6) promoting data interoperability. As a way to overcome these challenges, we further develop the proposals by early Internet pioneers for Digital Objects as encapsulations of data and metadata made accessible by persistent identifiers. In the past decade, this concept was revisited by various groups within the Research Data Alliance and put in the context of the FAIR Guiding Principles for findable, accessible, interoperable and reusable data. The basic components of a FAIR Digital Object (FDO) as a self-contained, typed, machine-actionable data package are explained. A survey of use cases has indicated the growing interest of research communities in FDO solutions. We conclude that the FDO concept has the potential to act as the interoperable federative core of a hyperinfrastructure initiative such as the European Open Science Cloud (EOSC).
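The abstract's notion of an FDO as a self-contained, typed, machine-actionable package reachable through a persistent identifier can be caricatured in a few lines. This is a conceptual sketch only, not any standardized FDO schema: the class, the registry functions, and the example identifier are all hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class DigitalObject:
    """Toy FAIR Digital Object: data plus metadata behind a persistent ID."""
    pid: str                    # persistent identifier (e.g. a Handle or DOI)
    type_id: str                # points to a registered type definition
    metadata: dict = field(default_factory=dict)
    bit_sequence_ref: str = ""  # where the actual bytes live

REGISTRY = {}  # stands in for a resolvable persistent-identifier system

def register(obj: DigitalObject):
    REGISTRY[obj.pid] = obj

def resolve(pid: str) -> DigitalObject:
    # machine-actionable: the PID alone suffices to retrieve object and type
    return REGISTRY[pid]

register(DigitalObject("21.example/obj-1", "dataset",
                       {"license": "CC-BY-4.0"},
                       "https://example.org/data.nc"))
print(resolve("21.example/obj-1").type_id)  # -> dataset
```

The point of the sketch is the indirection: clients interact only with PIDs and typed metadata, never with storage locations directly, which is what makes the objects findable, accessible and interoperable.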
ARTICLE | doi:10.20944/preprints201907.0263.v1
Subject: Chemistry, Analytical Chemistry Keywords: ambient ionization; mass spectrometry; high-throughput sampling; imaging; modular robot; open hardware; lab automation; peer production; open software; low-temperature plasma
Online: 23 July 2019 (15:20:38 CEST)
Mass spectrometry research laboratories have reported multiple probes for ambient ionization in recent years. Combining them with a mechanical moving stage enables automated sampling and imaging applications. We developed a robotic platform based on RepRap 3D-printer components, which is therefore easy to reproduce and to adapt for custom prototypes. The minimal step width of the Open LabBot is 12.5 μm, and the sampling dimensions (x, y, z) are 18 × 15 × 20 cm. Adjustable rails in an aluminium frame construction facilitate the mounting of additional parts such as sensors, probes, or optical components. The Open LabBot uses industry-standard G-code for its control; the simple syntax facilitates programming its movement. We developed two programs: 1) LABI-Imaging, for direct control via a USB connection and synchronization with MS data acquisition, and 2) RmsiGUI, which integrates all steps of mass spectrometry imaging: the creation of G-code for robot control, the assembly of imzML files from raw data, and the analysis of imzML files. We demonstrated the functionality of the system by the automated sampling and classification of essential oils with a PlasmaChip probe. Further, we performed an ambient ionization mass spectrometry imaging (AIMSI) experiment on a lime slice with laser desorption low-temperature plasma (LD-LTP) ionization, demonstrating the integration of the complete workflow in RmsiGUI. The design of the Open LabBot and the software are released under open licenses to promote their use and adoption in the instrument developers’ community.
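Because the platform is driven by industry-standard G-code, generating an imaging raster reduces to emitting a sequence of linear moves. The sketch below is a generic illustration of such G-code generation, not the authors' LABI-Imaging or RmsiGUI code; the function name, step size, and feed rate are arbitrary choices of ours.

```python
def raster_gcode(width_mm, height_mm, step_mm, feed=600):
    """Generate G-code for a serpentine raster scan in the XY plane.

    G21 selects millimetre units, G90 absolute positioning,
    and G1 performs a linear move at feed rate F (mm/min).
    """
    lines = ["G21", "G90"]
    y, forward = 0.0, True
    while y <= height_mm + 1e-9:
        x_start = 0.0 if forward else width_mm
        x_end = width_mm if forward else 0.0
        lines.append(f"G1 X{x_start:.3f} Y{y:.3f} F{feed}")  # move to row start
        lines.append(f"G1 X{x_end:.3f} Y{y:.3f} F{feed}")    # sweep the row
        forward = not forward
        y += step_mm
    return lines

# a 10 x 5 mm area sampled in 2.5 mm rows -> 3 serpentine passes
program = raster_gcode(10.0, 5.0, 2.5)
print("\n".join(program))
```

Streaming such a program over a USB serial link is how RepRap-class controllers are typically driven, which is what makes the hardware easy to reproduce with commodity parts.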
REVIEW | doi:10.20944/preprints201712.0182.v4
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: open access initiative; challenges of data sharing; data management; open government data; human-computer interaction; documentation; human factors; standardization; information policy
Online: 17 January 2018 (11:09:52 CET)
The release of government datasets for public use can potentially strengthen the relationship between a government and its constituents. However, research shows that there are several challenges to open data effectiveness. This paper reviews current determinants and issues associated with open government data (OGD) procedures. The review concentrates on two ends of the spectrum: first, the perspective of preparation by the government, focusing on the organization of traditional governmental datasets and how the recording of the data is administered; second, the perspective of the users, focusing on the way in which the data is released to the general public and on human-computer interaction (HCI) issues between end-users and data-consumption interfaces. Following a thorough analysis of these two opposing sets of challenges, the paper proposes approaches to mitigate them. This review and the subsequent recommendations contribute to and expand the current understanding of open government data effectiveness, and can lead to public policy changes, the development of new procedures and strategies, and ultimately improvements at both ends of the federal open data endeavor.
ARTICLE | doi:10.20944/preprints202106.0729.v1
Subject: Behavioral Sciences, Applied Psychology Keywords: age; behaviour; open field; physical activity; anxiety; Wistar rat.
Online: 30 June 2021 (10:57:00 CEST)
The aim of this work was to study age-related changes in the behaviour of adult Wistar rats using the open field (OF) and elevated plus maze (EPM) tests. Behavioural changes related to motor activity and anxiety were of particular interest. Results showed that as male and female rats progressed from 2 to 5 months of age, there was a decrease in the level of motor and exploratory activities and an increase in the level of anxiety. Age-related changes depended on the initial individual characteristics of behaviour. For example, animals that demonstrated high motor activity at 2 months became significantly less active by 5 months, and animals that showed a low level of anxiety at 2 months became more anxious by 5 months. Low-activity and high-anxiety rats did not show any significant age-related changes in the OF and EPM tests from 2 to 5 months of age, except for a decrease in the number of rearings in the EPM. The significant individual differences in the behaviour of rats in the OF and EPM tests observed at 2 months were no longer apparent by 5 months.
Subject: Physical Sciences, Condensed Matter Physics Keywords: open topological systems; Kitaev chains and ladders; Majorana fermions
Online: 14 June 2019 (09:41:01 CEST)
In this work the general problem of the characterization of the topological phase of an open quantum system is addressed. In particular, we study the topological properties of Kitaev chains and ladders under the perturbing effect of a current flux injected into the system using an external normal lead and derived from it via a superconducting electrode. After discussing the topological phase diagram of the isolated systems, using a scattering technique within the Bogoliubov-de Gennes formulation, we analyze the differential conductance properties of these topological devices as a function of all relevant model parameters. The relevant problem of implementing local spectroscopic measurements to characterize topological systems is also addressed by studying the system electrical response as a function of the position and the distance of the normal electrode (tip). The results show how the signatures of topological order affect the electrical response of the analyzed systems, a subset of such signatures being robust also against the effects of a moderate amount of disorder. The analysis of the internal modes of the nanodevices demonstrates that topological protection can be lost when quantum states of an initially isolated topological system are hybridized with those of the external reservoirs. The conclusions of this work could be useful in understanding the topological phases of nanowire-based mesoscopic devices.
ARTICLE | doi:10.20944/preprints201811.0099.v1
Subject: Social Sciences, Other Keywords: smart city; smart citizen; participation; smartmentality; open data; metaphor
Online: 5 November 2018 (10:19:03 CET)
The goal of the paper is to investigate the expected participation and mentality of smart citizens in smart cities. The key question is the role of the human factor in smart environments, studied globally through a research corpus of mainstream summaries, trend reports, white papers and visions produced by business-governmental-university research co-operations. First, a short review of changing scholarly trends is presented as a theoretical framework. With respect to its key ideas, the corpus-based findings are recapped and analysed through content networks and the most-referenced city strategies. In addition, a critical approach reveals further factors and risks that require investigation. The ultimate goal is to understand how the smart city landscape is shaped by citizen-based strategies, open data, empowerment and responsibility. Accordingly, the paper closes with theoretical, practical and metaphor-based recommendations to support business and political decision making, as well as the emerging scholarly trends in the context of upcoming technological-structural changes.
ARTICLE | doi:10.20944/preprints201810.0354.v1
Subject: Earth Sciences, Geoinformatics Keywords: open LiDAR; terrestrial images; building reconstruction; point cloud registration
Online: 16 October 2018 (11:20:43 CEST)
Recent advances in open data initiatives give us free access to a vast amount of open LiDAR data in many cities. However, most of these open LiDAR data over cities are acquired by airborne scanning, where the points on façades are sparse or even completely missing due to the viewpoint and object occlusions in the urban environment. Integrating other sources of data, such as ground images, to complete the missing parts is an effective and practical solution. This paper presents an approach for improving the coverage of open LiDAR data on building façades by using point clouds generated from ground images. A coarse-to-fine strategy is proposed to fuse these two different sources of data. First, the façade point cloud generated from terrestrial images is initially geolocated by matching the SFM camera positions to their GPS meta-information. Next, an improved Coherent Point Drift algorithm with normal consistency is proposed to accurately align building façades to the open LiDAR data. The significance of the work resides in the use of 2D overlapping points on the outlines of buildings instead of the limited 3D overlap between the two point clouds, and in the achievement of reliable and precise registration under possibly incomplete coverage and ambiguous correspondence. Experiments show that the proposed approach can significantly improve the façade details of buildings in open LiDAR data, improving registration accuracy from up to 10 meters to less than half a meter compared with classic registration methods.
ARTICLE | doi:10.20944/preprints201802.0149.v2
Subject: Social Sciences, Library & Information Science Keywords: CERN; Journal Flipping; Gold Open Access; Particle Physics; SCOAP3
Online: 19 March 2018 (14:15:52 CET)
Gigantic particle accelerators, incredibly complex detectors, an antimatter factory and the discovery of the Higgs boson – this is part of what makes CERN famous. Only a few know that CERN also hosts the world's largest Open Access initiative: SCOAP3. The Sponsoring Consortium for Open Access Publishing in Particle Physics (SCOAP3) started operation in 2014 and has since supported the publication of 19,000 Open Access articles in the field of particle physics, at no direct cost or burden to individual authors worldwide. SCOAP3 is made possible by a partnership of 3,000 institutes, in which libraries re-direct funds previously used for subscriptions to ’flip’ articles to ’gold Open Access’. With its recent expansion, the initiative now covers about 90% of the journal literature of the field. This article describes the economic principles of SCOAP3 and the collaborative approach of the partnership, and finally summarizes the financial results after four years of successful operation.
ARTICLE | doi:10.20944/preprints202302.0010.v1
Subject: Social Sciences, Library & Information Science Keywords: Machine learning; Scientometrics; Africa; Research community; Open science; Health informatics
Online: 1 February 2023 (10:57:26 CET)
Machine learning has seen enormous growth in the last decade, with healthcare being a prime application for advanced diagnostics and improved patient care. The application of machine learning for healthcare is particularly pertinent in Africa, where many countries are resource-scarce. However, it is unclear how much research on this topic is arising from African institutes themselves, which is a crucial aspect for applications of machine learning to unique contexts and challenges on the continent. Here, we conduct a bibliometric study of African contributions to research publications related to machine learning for healthcare, as indexed in Scopus, between 1993 and 2022. We identified 3,772 research outputs, with most of these published since 2020. North African countries currently lead the way with 64.5% of publications for the reported period, yet Sub-Saharan Africa is rapidly increasing its output. We found that international support in the form of funding and collaborations is correlated with research output generally for the continent, with local support garnering less attention. Understanding African research contributions to machine learning for healthcare is a crucial first step in surveying the broader academic landscape, forming stronger research communities, and providing advanced and contextually aware biomedical access to Africa.
ARTICLE | doi:10.20944/preprints202212.0238.v1
Subject: Social Sciences, Business And Administrative Sciences Keywords: open innovation; HRM; sustainability, UAE; inbound HR; outbound HR; technology.
Online: 15 December 2022 (06:55:17 CET)
This study proposes a structure for companies to use when implementing human resource practices in open innovation. Although open innovation has received considerable attention in the innovation management field as companies open their doors to information exchange in an effort to spur creative thinking, very few empirical articles connect this trend to the human resource management literature. Our findings are the result of an extensive qualitative investigation into Julphar Gulf Pharmaceutical Industries Manufacturers in the United Arab Emirates (UAE) and its open innovation program. The proposed structure rests on three primary pillars of human resource management: internal, external, and combined. We also demonstrate how the evolution of the open innovation initiative is linked to the state of the art in the HRM and open innovation literature. The framework identifies HRM practices for both internal and external participants in the open innovation effort. Much of this HRM is done off the books, in a setting separate from the host company. By providing actual evidence of how firms use HRM to manage open innovation projects, our research adds to the scant and mostly theoretical literature linking open innovation and HRM.
ARTICLE | doi:10.20944/preprints202107.0651.v1
Subject: Behavioral Sciences, Applied Psychology Keywords: multiple measures synchronization; automatic device integration; open-source; PsychoPy; Unity
Online: 29 July 2021 (11:48:02 CEST)
Background: The human mind is multimodal, yet most behavioral studies rely on century-old measures such as task accuracy and latency. To create a better understanding of human behavior and brain functionality, we should introduce other measures and analyze behavior from various aspects. However, it is technically complex and costly to design and implement experiments that record multiple measures. To address this issue, a platform that allows synchronizing multiple measures of human behavior is needed. Method: This paper introduces an open-source platform named OpenSync, which can be used to synchronize multiple measures in neuroscience experiments. This platform helps to automatically integrate, synchronize and record physiological measures (e.g., electroencephalogram (EEG), galvanic skin response (GSR), eye-tracking, body motion, etc.), user input responses (e.g., from mouse, keyboard, joystick, etc.), and task-related information (stimulus markers). In this paper, we explain the structure and details of OpenSync and provide two case studies, in PsychoPy and Unity. Comparison with existing tools: Unlike proprietary systems (e.g., iMotions), OpenSync is free and can be used inside any open-source experiment design software (e.g., PsychoPy, OpenSesame, Unity, etc.; https://pypi.org/project/OpenSync/ and https://github.com/moeinrazavi/OpenSync_Unity). Results: Our experimental results show that the OpenSync platform is able to synchronize multiple measures with microsecond resolution.
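The core idea behind synchronizing heterogeneous measures is to stamp every event, whatever device it comes from, against one shared clock. The toy sketch below illustrates that idea with Python's standard library only; it is not OpenSync's actual API, and the class and method names are hypothetical.

```python
import time

class MarkerStream:
    """Toy illustration of stamping events from multiple sources against one
    shared monotonic clock (NOT the OpenSync API; names are hypothetical)."""

    def __init__(self):
        self.t0 = time.monotonic_ns()  # common time origin for all streams
        self.events = []

    def push(self, source: str, payload):
        # microsecond-resolution timestamp relative to the shared origin
        t_us = (time.monotonic_ns() - self.t0) // 1_000
        self.events.append((t_us, source, payload))

stream = MarkerStream()
stream.push("keyboard", "space")
stream.push("eeg", "stimulus_onset")
# events from different devices now share one time base and can be merged
print(stream.events)
```

A monotonic clock is used rather than wall-clock time so that timestamps can never run backwards mid-experiment (e.g. during an NTP adjustment).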
ARTICLE | doi:10.20944/preprints202104.0770.v1
Subject: Social Sciences, Library & Information Science Keywords: Wikipedia, knowledge equity, Wikimedia, open culture, visual arts, cultural bias
Online: 29 April 2021 (09:16:07 CEST)
We explore gaps in Wikipedia's coverage of the visual arts by comparing the representation of 100 artists and 100 artworks from the Western canon against corresponding sets of notable artists and artworks from non-Western cultures. We measure the coverage of these two sets of topics across Wikipedia as a whole and for its individual language versions. We also compare the coverage for Wikimedia Commons and Wikidata, sister-projects of Wikipedia that host digital media and structured data. We show that all these platforms strongly favour the Western canon, giving many times more coverage to Western art. We highlight specific examples of differing coverage of visual art inside and outside the Western canon. We find that European language versions of Wikipedia are generally more "Western" in their coverage and Asian languages more "global", with interesting exceptions. We suggest how both Wikipedia and the wider cultural sector can address this gap in content and thus give Wikipedia a truly global perspective on the visual arts.
ARTICLE | doi:10.20944/preprints202104.0448.v1
Subject: Social Sciences, Accounting Keywords: preprint; citations; scholarly communication; open science; peer review; impact factor
Online: 16 April 2021 (16:45:49 CEST)
Preprints are regularly cited in peer-reviewed journal articles, books and conference papers. Are preprint citations somehow less important than citations of peer-reviewed research papers? This study investigates citation patterns between 2017 and 2020 for preprints published on three preprint servers: one specializing in biology (bioRxiv), one in chemistry (ChemRxiv), and one hosting preprints in all disciplines (Research Square). As the evaluation of scholarship continues to rely largely on citation-based metrics, this analysis and its outcomes will be useful for informing new research-based education in today’s scholarly communication.
TECHNICAL NOTE | doi:10.20944/preprints202103.0194.v1
Subject: Mathematics & Computer Science, Artificial Intelligence & Robotics Keywords: Active Learning, Classification, Machine Learning, Python, Github, Repository, Open Source
Online: 5 March 2021 (21:14:20 CET)
Machine learning applications often need large amounts of training data to perform well. Whereas unlabeled data can be gathered easily, the labeling process is difficult, time-consuming, or expensive in most applications. Active learning can help solve this problem by querying labels for those data points that will improve performance the most, so that the learning algorithm performs sufficiently well with fewer labels. We provide a library called scikit-activeml that covers the most relevant query strategies and implements tools to work with partially labeled data. It is programmed in Python and builds on top of scikit-learn.
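The querying idea can be sketched in a few lines of plain Python: pick the unlabeled point whose predicted class probabilities are least confident. This is a generic illustration of one common uncertainty-sampling strategy, not scikit-activeml's own API.

```python
def least_confident(probas):
    """Return the index of the sample whose top-class probability is lowest.

    `probas` is a list of per-sample class-probability lists, as produced by
    any probabilistic classifier's predict_proba-style output.
    """
    confidences = [max(p) for p in probas]
    return confidences.index(min(confidences))

# three unlabeled samples: the second lies closest to the decision boundary
probas = [[0.9, 0.1], [0.55, 0.45], [0.8, 0.2]]
print(least_confident(probas))  # -> 1: query a label for this sample next
```

In an active learning loop, the selected sample is sent to a human annotator, added to the labeled pool, and the classifier is retrained before the next query.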
ARTICLE | doi:10.20944/preprints202102.0198.v1
Subject: Engineering, Automotive Engineering Keywords: 180-degree bend; Sediment transport; Clear water; Open-channels; Scour
Online: 8 February 2021 (12:12:09 CET)
As 180-degree meanders are observed in abundance in nature, a meandering channel with two consecutive 180-degree bends was designed and constructed to investigate bed topography variations. The two 180-degree mild bends are located between upstream and downstream straight paths. In this study, different ratios of mean velocity to critical velocity were tested at the upstream straight path to determine the meander's incipient motion. To this end, bed topography variations along the meander and the downstream straight path were recorded for the different velocity ratios. In addition, the upstream bend's effect on the downstream bend was investigated. Results indicated that as the mean velocity to critical velocity ratio changed from 0.8 to 0.84, 0.86, 0.89, 0.92, 0.95, and 0.98, the maximum scour depth at the downstream bend increased by factors of 1.5, 2.5, 5, 10, 12, and 26, respectively. Moreover, increasing the ratio increased the maximum sedimentary height by factors of 3, 10, 23, 48, 49, and 56. Incipient motion at the upstream bend was observed at a mean velocity to critical velocity ratio of 0.89, while at the downstream bend it occurred at 0.78.
REVIEW | doi:10.20944/preprints202010.0243.v1
Subject: Social Sciences, Accounting Keywords: distance education; open and distance education; student retention; survival analysis
Online: 12 October 2020 (13:22:55 CEST)
Student retention is one indicator of accountability in the implementation of educational programs. Achieved retention rates reflect the performance of an institution or college against its quality objectives. To get an accurate picture of the factors related to retention, statistical modeling is needed. The retention variable is a time-to-event response measured in semesters, and survival analysis is one of the statistical methods suited to analyzing such data. Selecting an accurate analytical method yields valid conclusions and supports policies that are well targeted. This paper presents survival analysis as an alternative approach to modeling student retention in distance education, based on a literature review. It also briefly describes distance education, open and distance education, the characteristics of distance education students, student retention in distance education, and survival models for modeling it.
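The survival-analysis view of retention can be sketched with a minimal Kaplan-Meier product-limit estimate. The data below are made up for illustration (time in semesters; event = dropout, censored = still enrolled or graduated), not taken from any reviewed study.

```python
# Minimal Kaplan-Meier survival estimate for student retention.
import numpy as np

semesters = np.array([1, 2, 2, 3, 4, 4, 5, 6, 6, 8])  # observed times
dropout   = np.array([1, 1, 0, 1, 1, 0, 0, 1, 0, 0])  # 1 = dropout event

surv = 1.0
curve = {}
for t in np.unique(semesters):
    at_risk = np.sum(semesters >= t)               # still enrolled at t
    events = np.sum((semesters == t) & (dropout == 1))
    surv *= 1 - events / at_risk                   # product-limit step
    curve[int(t)] = surv                           # P(retained beyond t)
```

Censored observations reduce the at-risk count without counting as dropouts, which is exactly why survival methods suit retention data better than naive proportions.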
SHORT NOTE | doi:10.20944/preprints202001.0196.v1
Subject: Biology, Entomology Keywords: reproducibility; open access; data curation; data management; pre-print servers
Online: 18 January 2020 (09:05:49 CET)
The ability to replicate scientific experiments is a cornerstone of the scientific method. Sharing ideas, workflows, data, and protocols facilitates testing the generalizability of results, increases the speed at which science progresses, and enhances quality control of published work. Fields of science such as medicine, the social sciences, and the physical sciences have embraced practices designed to increase replicability. Granting agencies, for example, may require data management plans, and journals may require data and code availability statements along with the deposition of data and code in publicly available repositories. While many tools commonly used in replicable workflows, such as distributed version control systems (e.g. “git”) or scripted programming languages for data cleaning and analysis, may have a steep learning curve, their adoption can increase individual efficiency and facilitate collaborations both within entomology and across disciplines. The open science movement is developing within the discipline of entomology, but practitioners of these concepts, or those desiring to work more collaboratively across disciplines, may be unsure where or how to embrace these initiatives. This article introduces some of the tools entomologists can incorporate into their workflows to increase the replicability and openness of their work. We describe these tools and others, recommend additional resources for learning more about them, and discuss the benefits to both individuals and the scientific community, as well as potential drawbacks associated with implementing a replicable workflow.
ARTICLE | doi:10.20944/preprints201909.0162.v1
Subject: Engineering, Mechanical Engineering Keywords: thermal actuator; compliant architecture; open and closed operating cycles; mesoscale
Online: 16 September 2019 (10:56:57 CEST)
Thermal actuators are known for generating large force and displacement strokes in the mesoscale (millimeter) regime. In particular, two-phase thermal actuators benefit from the scaling laws of physics at mesoscale to offer large force and displacement strokes, but they have low thermal efficiencies. As an alternative, a combustion-based thermal actuator is proposed and its performance is studied in both open and closed cycle operations. Through a physics-based lumped-parameter model, we investigate the behavior and performance of the actuator using a spring-mass-damper analogy and an air standard cycle approach. Three observations are reported: (1) the mesoscale actuator can generate peak forces of up to 400 N and displacement strokes of about 16 cm, suitable for practical applications; (2) an increase in heat input raises the thermal efficiency of the actuator for both open and closed cycles; and (3) for a specific heat input, the open and closed cycle operations respond differently, with different stroke lengths, peak pressures, and thermal efficiencies.
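The spring-mass-damper analogy behind the lumped-parameter model can be sketched numerically. All parameter values below are illustrative assumptions chosen so that the steady-state stroke F/k matches the roughly 16 cm quoted in the abstract at a 400 N force; they are not taken from the paper.

```python
# Spring-mass-damper sketch of a lumped-parameter actuator model:
# m*x'' + c*x' + k*x = F (constant driving force).
m, c, k = 0.05, 10.0, 2500.0   # mass [kg], damping [N s/m], stiffness [N/m]
F = 400.0                      # constant driving force [N]

dt, x, v = 1e-5, 0.0, 0.0
for _ in range(int(0.5 / dt)):        # 0.5 s of semi-implicit Euler
    a = (F - c * v - k * x) / m       # Newton's second law
    v += a * dt
    x += v * dt                       # stroke settles toward F/k = 0.16 m
```

Updating the velocity before the position (semi-implicit Euler) keeps this oscillatory system numerically stable without a stiff solver.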
ARTICLE | doi:10.20944/preprints201906.0154.v1
Subject: Social Sciences, Library & Information Science Keywords: Open Access; institutional repositories; institutional mandates; self-archiving; Estudo Geral
Online: 17 June 2019 (07:08:09 CEST)
The changes the Internet brought to scholarly communication, together with the spread of the Open Access movement, have made it possible to increase the number of potential readers of published research dramatically. The first phase of this two-phase study assesses how far the potential for increased open access to articles published by authors at the University of Coimbra was realized, in a context where the only stimulus for openness of published science was the institutional mandate set by the University's policy on Open Access (“Acesso Livre”). Realization of this potential was measured by observing the actual archiving behavior of researchers (either directly or through their agents). We started by selecting the journal titles most used to publish the STEM research of the University of Coimbra (2004-2013), using Thomson Reuters' Science Citation Index (SCI). These titles were available at the University libraries or through online subscriptions, some of them in open access (21%). By checking the journals' self-archiving policies at the time in the SHERPA/RoMEO service, we found that the percentage of articles in Open Access (OA) could rise to 80% if they were deposited in Estudo Geral, the institutional repository of the University of Coimbra, as prescribed by the University's Open Access policy. By verifying the deposit status of every paper that University researchers published in those journals, we found this potential far from fulfilled, despite the existence of the institutional mandate and favorable editorial conditions. We conclude, therefore, that an institutional mandate is not sufficient by itself to fully implement an open access policy and to close the gap between publication and access.
The second phase of the study, to follow, will re-examine the status of published papers in a context where the Portuguese public funding agency, the Fundação para a Ciência e a Tecnologia (FCT), introduced a significant new stimulus for open access in science in 2014. The FCT Open Access Policy stipulates that publicly funded published research must be made available as soon as possible in a repository of the Portuguese network of scientific repositories, RCAAP, which integrates Estudo Geral.
ARTICLE | doi:10.20944/preprints201905.0015.v1
Subject: Medicine & Pharmacology, General Medical Research Keywords: laparoscopic; open surgery; non-metastatic colorectal cancer; single surgeon experience
Online: 5 May 2019 (11:25:43 CEST)
The oncologic merits of the laparoscopic technique for colorectal cancer surgery remain debatable. Eligible patients with non-metastatic colorectal cancer scheduled for elective resection by a single surgeon at one medical institution were randomized to either laparoscopic or open treatment. In total, 188 patients underwent laparoscopic surgery and 163 the open approach. The primary endpoint was cancer-free 5-year survival after operative treatment, and the secondary endpoint was the incidence of tumor recurrence. We found no statistically significant difference between the open and laparoscopic groups in the average number of lymph nodes dissected, overall mortality rate, cancer recurrence rate, or cancer-free 5-year survival. Nevertheless, the laparoscopic approach achieved a shorter hospital stay and less blood loss, although operation time was significantly longer. Significantly fewer patients in the laparoscopic group developed postoperative urinary tract infection, wound infection, pneumonia, or anastomotic leakage. For non-metastatic colorectal cancer patients, laparoscopic surgery thus produced better short-term outcomes in terms of both total complications and intra-operative blood loss. Although there was no statistically significant difference in cancer-free 5-year survival or tumor recurrence, we favor laparoscopic surgery for these patients where it is not contraindicated.
ARTICLE | doi:10.20944/preprints201904.0008.v1
Subject: Earth Sciences, Geoinformatics Keywords: GRASS GIS; g.citation; software citation; open science; OSGeo; credit; rewards
Online: 1 April 2019 (10:19:53 CEST)
The authors introduce the GRASS GIS add-on module g.citation as an initial implementation of a fine-grained software citation concept. The module extends the existing citation capabilities of GRASS GIS, which until now only provided automated citation of the software project as a whole, authored by the GRASS Development Team, without reference to individual persons. The new module enables individual code citation for each of the over 500 implemented functionalities, including add-on modules. Three different classes of citation output are provided in a variety of human- and machine-readable formats. The implications of this reference implementation of scientific software citation for both the GRASS GIS project and the OSGeo Foundation are outlined.
ARTICLE | doi:10.20944/preprints201901.0165.v3
Subject: Social Sciences, Library & Information Science Keywords: Plan S; open access journals; APC; technical requirements; publisher size
Online: 22 January 2019 (11:39:01 CET)
Much of the debate on Plan S concentrates on how to make toll-access journals open access, taking for granted that existing open access journals are Plan S compliant. We suspected this was not so, and set out to explore it using DOAJ's journal metadata. We conclude that an overwhelmingly large majority of open access journals are not Plan S compliant, and that small HSS publishers not charging APCs are the least compliant and will face major challenges in becoming compliant. Plan S needs to give special consideration to smaller publishers and/or non-APC-based journals.
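The kind of compliance check applied to journal metadata can be sketched as follows. The field names and the two criteria used here are simplified assumptions for illustration, not the authors' actual coding scheme or the DOAJ schema.

```python
# Hypothetical sketch: screen DOAJ-style journal records against a
# couple of Plan S technical requirements (simplified assumptions).
journals = [
    {"title": "J. A", "license": "CC BY",    "has_preservation": True},
    {"title": "J. B", "license": "CC BY-NC", "has_preservation": False},
    {"title": "J. C", "license": "CC BY",    "has_preservation": False},
]

def plan_s_ok(journal):
    # Two example requirements: a CC BY license and a long-term
    # digital preservation arrangement.
    return journal["license"] == "CC BY" and journal["has_preservation"]

compliant = [j["title"] for j in journals if plan_s_ok(j)]
share = len(compliant) / len(journals)
```

Because compliance requires every criterion to hold simultaneously, even a journal meeting most requirements (like "J. C" above) counts as non-compliant, which is how small shares of compliant journals arise.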
ARTICLE | doi:10.20944/preprints201706.0093.v2
Subject: Keywords: decision support; energy system modelling; optimization; collaborative development; open science
Online: 27 March 2018 (05:34:38 CEST)
Energy system models have become indispensable for shaping future energy systems by providing insights into different trajectories. However, sustainable systems with high shares of renewable energy are characterized by growing cross-sectoral interdependencies and decentralized structures, so sophisticated and flexible modelling tools are needed to capture the important properties of increasingly complex energy systems. At the same time, open science is becoming increasingly important in energy system modelling. This paper presents the Open Energy Modelling Framework (oemof) as a novel approach to energy system modelling, representation, and analysis. The framework forms a toolbox for constructing comprehensive energy system models and has been published open source under a free license. Through collaborative development based on open processes, the framework seeks a maximum level of participation and transparency to facilitate open science principles in energy system modelling. Its generic graph-based description of energy systems makes it well suited to flexibly model complex cross-sectoral systems and to incorporate various modelling approaches. This makes the framework a multi-purpose environment for modelling and analyzing systems ranging from the urban to the transnational scale.
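The generic graph-based description can be sketched with plain Python: components are nodes and energy flows are directed edges, and each bus must balance. This mirrors the spirit of oemof's bus/source/sink/flow concepts but is not the oemof API; all names and numbers are made up.

```python
# Sketch of a graph-based energy system: directed edges carry
# energy flows [MWh] between component nodes.
flows = {
    ("wind", "el_bus"): 30.0,
    ("gas_plant", "el_bus"): 20.0,
    ("el_bus", "demand"): 45.0,
    ("el_bus", "storage"): 5.0,
}

def balance(node):
    """Net energy balance at a node: inflow minus outflow."""
    inflow = sum(v for (src, dst), v in flows.items() if dst == node)
    outflow = sum(v for (src, dst), v in flows.items() if src == node)
    return inflow - outflow

bus_balance = balance("el_bus")   # 0.0 for a balanced bus
```

Keeping the system as an explicit graph is what lets one description feed different solvers and analyses, from a single urban bus to a multi-country network.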
ARTICLE | doi:10.20944/preprints201708.0022.v1
Subject: Mathematics & Computer Science, Other Keywords: real‐time reconstruction; SLAM; kinect sensors; depth cameras; open source
Online: 7 August 2017 (11:03:23 CEST)
Given a stream of depth images with a known cuboid reference object present in the scene, we propose a novel approach for accurate camera tracking and volumetric surface reconstruction in real time. Our contribution is threefold: (a) utilizing a priori knowledge of the cuboid reference object, we maintain drift-free camera tracking without explicit global optimization; (b) we improve the fineness of the volumetric surface representation with a prediction-corrected data fusion strategy rather than a simple moving average, enabling accurate reconstruction of high-frequency details such as sharp object edges and geometries of high curvature; (c) we introduce a benchmark dataset, CU3D, containing both synthetic and real-world scanning sequences with ground-truth camera trajectories and surface models for quantitative evaluation of 3D reconstruction algorithms. We test our algorithm on this dataset and demonstrate its accuracy compared with other state-of-the-art algorithms. We release both our dataset and code as open source for other researchers to reproduce and verify our results.
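For context, the moving-average baseline the paper improves upon is the standard per-voxel weighted running average of truncated signed distances (as in KinectFusion-style pipelines). A minimal sketch with illustrative values:

```python
# Weighted running-average fusion of truncated signed distances:
# each voxel keeps a fused distance and an accumulated weight.
import numpy as np

W_MAX = 64.0   # weight cap keeps the volume adaptable to change

def fuse(tsdf, weight, d_new, w_new=1.0):
    """Fold one new depth observation into the running average."""
    fused = (weight * tsdf + w_new * d_new) / (weight + w_new)
    return fused, np.minimum(weight + w_new, W_MAX)

tsdf, weight = np.zeros(4), np.zeros(4)     # 4 example voxels
for frame in [np.array([0.1, -0.2, 0.05, 0.3])] * 5:
    tsdf, weight = fuse(tsdf, weight, frame)
```

Because this update averages all observations equally, it smooths away high-frequency detail at edges, which is the limitation the proposed prediction-corrected strategy targets.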
ARTICLE | doi:10.20944/preprints202212.0199.v1
Subject: Behavioral Sciences, Other Keywords: Open defecation; rural women; Ghana; Environmental Health; Demographic and Health Survey
Online: 12 December 2022 (10:03:35 CET)
The study investigated determinants of open defecation among rural women in Ghana, using data extracted from the women's files of the 2003, 2008, and 2014 Ghana Demographic and Health Survey (GDHS). The pooled sample comprised 4,284 rural women aged 15-49 with complete information on the variables analyzed. The outcome variable was open defecation (defecating in an open space rather than a toilet facility), and fourteen key explanatory variables were used. Two regression models were built, with output reported as odds ratios. Descriptively, 42 in every 100 women aged 15-49 practised open defecation (n=1811, 95% CI=49-52). Open defecation correlated significantly with educational attainment, wealth status, religion, access to mass media, partner's education, and zone of residence. The likelihood of practising open defecation was reduced among women with formal education [aOR=0.69, CI=0.56-0.85], those whose partners had formal education [aOR=0.64, CI=0.52-0.80], women in the rich wealth quintile [aOR=0.12, CI=0.07-0.20], traditionalists [aOR=0.33, CI=0.19-0.57], and those with access to mass media [aOR=0.70, CI=0.57-0.85]. Residents of the Savannah zone were over 21 times more likely to defecate openly [aOR=21.06, CI=15.97-27.77]. The prevalence of open defecation is disproportionately pro-poor, indicating that impoverished rural women are more likely to practise it.
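To make the reported effect sizes concrete: an odds ratio below 1, such as aOR=0.69 for formal education, means lower odds of the outcome in the exposed group. The counts below are made up for illustration, not from the GDHS data.

```python
# Unadjusted odds ratio from a 2x2 table (illustrative counts):
#                    practised OD   did not
import numpy as np

counts = np.array([[120, 280],    # formal education
                   [300, 300]])   # no formal education

odds_edu = counts[0, 0] / counts[0, 1]    # odds given formal education
odds_none = counts[1, 0] / counts[1, 1]   # odds given no formal education
odds_ratio = odds_edu / odds_none         # < 1: education is protective
```

The adjusted odds ratios (aOR) in the abstract come from a logistic regression that controls for the other thirteen explanatory variables, but they are read the same way.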
ARTICLE | doi:10.20944/preprints202209.0470.v1
Subject: Social Sciences, Marketing Keywords: Brand rank; content marketing; predictive model; open data policy; e-commerce
Online: 30 September 2022 (02:02:05 CEST)
Background: Content marketing is increasingly important for online branding. Brand popularity can be determined more easily online than sales-based measures, but is not yet well explained from a content marketing perspective. Promising predictors are open data syndication policies, connectivity to e-commerce platforms, product reviews, data health, and the depth and width of a brand's product portfolio. A predictive content marketing model can help brand owners understand their e-commerce potential. Methods: We used brand popularity (Brand Popularity Rank) and catalog data in combination with product reviews from an independent content aggregator. For all datasets, we selected the overlapping set for brand popularity and brand reviews over a 90-day period from June 10, 2022 to September 24, 2022 (n = 333 brands). Backward stepwise multiple linear regression was used to develop a predictive content marketing model of Brand Popularity Rank. Results: The stepwise backward multiple linear regression selected five highly significant (p < 0.01) predictors of brand rank: the brand's data syndication policy, the number of connected e-commerce platforms, the brand's number of products, its number of products per category, and the number of product categories in which it is active. The model explains 78% of the variance in Brand Popularity Rank and has a good, highly significant fit: F(5, 327) = 233.5, p < 0.00001. Conclusions: A content marketing model can adequately predict Brand Popularity Rank based on online popularity. In this model, an open content syndication policy, more connected e-commerce platforms, and a larger catalog, i.e., presence in more categories and more products per category, are each related to a better (lower) Brand Popularity Rank score.
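Backward stepwise elimination can be sketched as repeatedly dropping the weakest predictor until all remaining ones are significant. The sketch below uses synthetic data and a |t| >= 2 retention rule (roughly p < 0.05) in place of exact p-values; it is not the paper's brand dataset or software.

```python
# Backward stepwise multiple linear regression (illustrative sketch).
import numpy as np

rng = np.random.default_rng(1)
n = 200
X = rng.normal(size=(n, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 2] + rng.normal(size=n)  # only x0, x2 matter

cols = list(range(5))
while True:
    Xc = np.column_stack([np.ones(n), X[:, cols]])      # add intercept
    beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)       # OLS fit
    resid = y - Xc @ beta
    sigma2 = resid @ resid / (n - Xc.shape[1])          # error variance
    se = np.sqrt(sigma2 * np.diag(np.linalg.inv(Xc.T @ Xc)))
    t = np.abs(beta[1:] / se[1:])                       # skip intercept
    if t.min() >= 2.0:                                  # all significant
        break
    cols.pop(int(np.argmin(t)))                         # drop weakest
selected = cols
```

The procedure keeps the genuinely predictive columns and discards the noise ones, mirroring how the five significant predictors in the abstract survive elimination.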
ARTICLE | doi:10.20944/preprints202209.0023.v1
Subject: Mathematics & Computer Science, Applied Mathematics Keywords: Pancreatic cancer; cancer evolution; tumour microenvironment; mathematical model; open quasispecies model
Online: 1 September 2022 (10:52:10 CEST)
Pancreatic cancer represents one of the difficult problems of contemporary medicine. The disease develops very slowly, takes place in a special niche (the stroma), and manifests clinically close to its final stage. Another feature of this pathology is a symbiotic coexistence effect between cancer cells and normal cells inside the stroma. All these aspects make it difficult to understand the pathogenesis of pancreatic cancer and to develop a proper therapy. The emergence of pancreatic pre-cancer and cancer cells represents a branching stochastic process engaging populations of 64 cell types differing in the number of acquired mutations. In this study we formulate and calibrate a mathematical model of pancreatic cancer using the quasispecies framework. The model incorporates the mutation matrix, the fitness landscape matrix, and the death rates. Each element of the mutation matrix gives the probability that a specific mutation appears in the branching sequence of cells representing the accumulation of mutations. The model incorporates cancer cell elimination by effector CD8 T cells (CTLs), with parameterized down-regulation of CTL effector function and exhaustion, and accounts for the symbiotic coexistence of normal and cancer cells. The computational predictions obtained with the model are consistent with empirical data. The modelling approach can be used to investigate other types of cancer and to examine various treatment procedures.
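The quasispecies framework the model builds on can be sketched in its classic closed form on a toy genotype set. The paper's model tracks 64 cell types with additional death and immune terms; the sketch below uses 4 genotypes, a symmetric mutation matrix, and illustrative fitness values, all of which are assumptions for demonstration only.

```python
# Classic quasispecies dynamics: dx/dt = Q^T (f * x) - phi(x) * x,
# where Q is the mutation matrix, f the fitness landscape, and
# phi = f . x the mean fitness keeping total frequency at 1.
import numpy as np

f = np.array([1.0, 1.2, 1.5, 2.0])   # fitness landscape (illustrative)
mu = 0.05                             # per-replication mutation probability
# Stay with prob 1-mu, otherwise mutate uniformly to another genotype.
Q = (1 - mu) * np.eye(4) + (mu / 3) * (np.ones((4, 4)) - np.eye(4))

x = np.array([1.0, 0.0, 0.0, 0.0])   # start: all wild type
dt = 0.01
for _ in range(20000):                # forward Euler to equilibrium
    phi = f @ x                       # mean fitness (outflow term)
    x += dt * (Q.T @ (f * x) - phi * x)
```

At equilibrium the population concentrates on the fittest genotype but retains a mutant cloud around it, the quasispecies signature that makes the framework apt for tracking mutation accumulation in pre-cancer cell populations.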