ARTICLE | doi:10.20944/preprints202106.0217.v1
Subject: Engineering, Automotive Engineering Keywords: solar collection; solid structure; heat absorption analysis; collection heat analysis; reflection spectrum
Online: 8 June 2021 (12:06:13 CEST)
A solid structure, such as a road, building wall or envelope, used as a solar collector is considered an effective new way to use renewable energy. This paper focuses on the temperature characteristics of four structures exposed to sunshine: asphalt, red brick, composite cement and concrete road slabs. Furthermore, the heat collected via a hydraulic system was investigated experimentally. The temperature differences among the four structure slabs arose because solar radiation absorption varied greatly with each material's absorptance and color. In the tests, the asphalt slab attained the highest temperature and had the weakest reflection among the structures; its temperature was 8.1%, 14.9% and 16.4% greater than that of the brick, composite cement and concrete slabs, respectively. A reflection intensity growth ratio was defined, which indicates the growth potential for absorbing radiation at the solid slab surface. From the experiments, it was concluded that a suitable selection of road materials can greatly improve thermal absorption, conduction and penetration into the solid slab. The heat collection capability was approximately 250 W/m2 to 350 W/m2 under natural summer conditions. A black coating or a surface modification allows more heat to be collected, exceeding 250 W/m2. The solar heat-collection efficiency of a road slab with a suitable surface configuration can exceed 30% in summer.
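The reported figures imply a simple energy balance: collection efficiency is collected heat divided by incident solar irradiance. The sketch below assumes a typical clear-sky summer irradiance of about 1000 W/m2, which is our assumption rather than a value from the paper, and the function name is ours.

```python
def collection_efficiency(q_collected_w_m2, irradiance_w_m2=1000.0):
    """Fraction of incident solar power captured by the slab's hydraulic system.

    irradiance_w_m2 = 1000 W/m2 is an assumed typical clear-sky value,
    not a figure from the paper.
    """
    return q_collected_w_m2 / irradiance_w_m2

# The paper reports roughly 250-350 W/m2 of collected heat in summer,
# which under this irradiance assumption spans 25-35% efficiency,
# consistent with the "above 30%" claim:
low, high = collection_efficiency(250.0), collection_efficiency(350.0)
```

Under weaker irradiance the same collected heat would of course imply a higher efficiency, so the 30% figure is sensitive to this assumption.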
ARTICLE | doi:10.20944/preprints202008.0608.v1
Online: 27 August 2020 (09:43:59 CEST)
Bacterial collections are invaluable tools for microbiologists. However, their practical use is compromised by imprecise taxonomic assignment of bacterial strains. This is particularly true for soft-rotting plant pathogens of the genus Pectobacterium. To address this difficulty, we analyzed the taxonomic status of 265 Pectobacterium strains deposited in the CIRM-CFBP collection from 1944 to 2020. This collection gathers Pectobacterium strains isolated in 27 countries from 32 plant species representing 17 botanical families, or from non-host environments. An MLSA approach, complemented by genomic analysis of 15 strains, was performed to update the taxonomic status of these 265 strains. The results showed that the CIRM-CFBP Pectobacterium collection harbours at least one strain of each species, with the exception of P. polonicum. However, 6 strains could not be assigned to any of the described species and may represent at least two new species. Surprisingly, P. versatile, a species described only in 2019, is the most prevalent species among CIRM-CFBP strains. Analysis of P. versatile strains revealed that this species occurs worldwide on various host plants and in various environments. In contrast, other species gathered strains isolated from only one botanical family or exclusively from freshwater environments. Our work also revealed new host plants for several Pectobacterium spp.
ARTICLE | doi:10.20944/preprints201808.0029.v1
Subject: Earth Sciences, Environmental Sciences Keywords: Landsat, analysis ready data, collection 1
Online: 1 August 2018 (20:03:52 CEST)
Data that have been processed to allow analysis with a minimum of additional user effort are often referred to as Analysis Ready Data (ARD). Large-scale Landsat analysis relies on access to observations that are geometrically and radiometrically consistent, and that have had non-target features (clouds) and poor-quality observations flagged so that they can be excluded. The United States Geological Survey (USGS) has processed the entire Landsat 4 and 5 Thematic Mapper (TM), Landsat 7 Enhanced Thematic Mapper Plus (ETM+), and Landsat 8 Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS) archive over the conterminous United States (CONUS), Alaska, and Hawaii into Landsat ARD. The ARD significantly reduce the pre-processing burden on users of Landsat data. Provision of pre-prepared ARD is intended to make it easier for users to produce Landsat-based maps of land cover, land-cover change and other derived geophysical and biophysical products. The ARD are provided as tiled, georegistered, top-of-atmosphere and atmospherically corrected products defined in a common equal-area projection, accompanied by spatially explicit quality assessment information.
ARTICLE | doi:10.20944/preprints202109.0371.v1
Online: 22 September 2021 (10:20:11 CEST)
The newly discovered coronavirus disease (COVID-19) has disrupted traditional methods of conducting research, particularly qualitative research. However, a number of methods by which qualitative data can still be collected remain. These include the use of digital voice, video, and text-based tools, online surveys, and content analysis. Text-based sources can help to overcome the limitations of time and space, and can also be cost-effective. This chapter draws on data collected from 12 participants across Zimbabwe and demonstrates how these tools can be used to generate data, or to sample data that is already available, to satisfy research questions and meet research objectives. It recommends that researchers experiment with new ways of collecting qualitative data while also observing safety protocols and ethical considerations.
ARTICLE | doi:10.20944/preprints202110.0165.v1
Subject: Engineering, Electrical & Electronic Engineering Keywords: Dark Net; Dark Web; COVID-19; data collection
Online: 11 October 2021 (14:19:46 CEST)
The Dark Web is known as a place that enables a variety of criminal activities. Anonymization techniques facilitate illegal operations, leading to the loss of confidential information and its further use as bait, a trade product or even a crime tool. Despite technical progress, there is still not enough awareness of the Dark Web and its hidden activity. In this study, we introduce Dark Web Enhanced Analysis (DWEA) in order to analyze and gather information about the content accessed on the Dark Net based on data characteristics. The research was performed to identify how the Dark Web has been influenced by recent global events, such as the COVID-19 pandemic. It included the use of a crawler, which scans the network and collects data for further analysis with machine learning. The result of this work characterizes the influence of the COVID-19 pandemic on the Dark Net.
BRIEF REPORT | doi:10.20944/preprints202004.0009.v5
Online: 7 July 2020 (18:09:21 CEST)
The world is facing a major health crisis, the global pandemic of COVID-19 caused by the SARS-CoV-2 coronavirus, for which no approved antiviral agents or vaccines are currently available. Here we describe a collection of codon-optimized coding sequences for SARS-CoV-2 cloned into Gateway-compatible entry vectors, which enable rapid transfer into a variety of expression and tagging vectors. The collection is freely available via Addgene. We hope that widespread availability of this SARS-CoV-2 resource will enable many subsequent molecular studies to better understand the viral life cycle and how to block it.
ARTICLE | doi:10.20944/preprints202105.0341.v1
Subject: Earth Sciences, Atmospheric Science Keywords: Soil erosion; Winds; Sand collection efficiency; Dust horizontal flux
Online: 14 May 2021 (14:45:47 CEST)
The horizontal sand-dust flux is an important parameter and foundation for the study of aeolian sand transport. In this study, a field experiment was conducted to measure aeolian transport and microclimate data during different dust events using an automatic sand sampler, a piezoelectric saltation sensor (H11-Sensit) and a 10 m meteorological tower at Ta Zhong, in the hinterland of the Taklimakan Desert, from July to August 2010. The sampling efficiency of the automatic sand sampler and the near-surface horizontal dust flux were then analyzed from the observed data. The results were as follows. The tipping frequency of the sand collector increased with the intensity of the dust event, following the power function y = 2.115x^0.9841 (R2 = 0.9206); the tipping frequency rose markedly, from 0.2794 to 1.3041 tips per minute. As the dust weather strengthened, the average sediment mass per tip decreased: 3.7160 g during sandstorm weather, 4.0275 g during grade I blowing-sand weather and 5.0035 g during grade II blowing-sand weather. The horizontal dust flux of the different dust events was calculated with the equation Q = 256M; the maximum for a single dust event was about 190.335 kg, and the minimum was 1.2 kg. Overall, the sand transport rate increased with wind speed. However, during some dust events the sand transport rate did not track wind speed closely; in these cases the corresponding surface temperature was significantly higher. The experimental data can provide a theoretical basis for regional sand control and for enacting effective engineering measures.
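The two empirical relations reported above can be applied directly. The coefficients come from the abstract itself, but the function names, and the reading of x as a dimensionless dust-event intensity index, are our assumptions.

```python
def tipping_frequency(intensity):
    """Sand-collector tips per minute as a power function of dust-event
    intensity: y = 2.115 * x**0.9841 (R2 = 0.9206 in the field data).
    Treating x as a dimensionless intensity index is our assumption."""
    return 2.115 * intensity ** 0.9841

def horizontal_dust_flux(collected_mass):
    """Horizontal dust flux Q of a dust event from the collected
    sediment mass M, via the paper's empirical relation Q = 256 * M.
    Units follow whatever units M is supplied in."""
    return 256.0 * collected_mass
```

Because the exponent 0.9841 is close to 1, the tipping frequency is nearly proportional to event intensity over the observed range.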
REVIEW | doi:10.20944/preprints201809.0567.v1
Subject: Engineering, Civil Engineering Keywords: pavement distress; pavement management; distress identification; data collection system
Online: 28 September 2018 (12:28:00 CEST)
Road pavement condition affects safety and comfort, traffic and travel times, vehicle operating costs and emission levels. In order to optimize road pavement management and guarantee satisfactory mobility conditions for all road users, the Pavement Management System (PMS) is an effective tool for the road manager. An effective PMS requires the availability of pavement distress data and the possibility of data maintenance and updating, in order to evaluate the best maintenance program. In the last decade, much research has focused on pavement distress detection, using a huge variety of technological solutions for both data collection and information extraction and qualification. This paper presents a literature review of data collection systems and processing approaches aimed at pavement condition evaluation. Both commercial solutions and research approaches are included. The main goal is to draw a framework of existing solutions and to consider them from different points of view, in order to identify those most suitable for further research and technical improvement, also considering emerging automated and semi-automated technologies. A further aim is to evaluate how well each data collection and extraction approach suits each type of distress, across the detection, classification and quantification phases of the procedure.
ARTICLE | doi:10.20944/preprints201705.0190.v1
Subject: Mathematics & Computer Science, General & Theoretical Computer Science Keywords: wireless sensor network; object tracking; dual sink; data collection
Online: 26 May 2017 (04:58:00 CEST)
Continuous object tracking in WSNs, such as the monitoring of mud-rock flows, forest fires, etc., is a challenging task due to the characteristic nature of continuous objects. They can appear randomly in the network field, move continuously, and change in size and shape. Monitoring such objects in real time generally requires a tremendous amount of messaging between sensor nodes to synergistically estimate an object's movement and track its location. In this paper, we propose a novel twofold-sink mechanism comprising a mobile and a static sink node. Both sink nodes gather information about boundary sensor nodes, which is then used to distribute energy consumption uniformly across all network nodes, thus helping to conserve the residual energy of network nodes. Numerous object tracking schemes using mobile sinks have been proposed in the literature. However, existing schemes employing mobile sinks cannot be applied to track continuous objects, because of the significant variation of the network topology. Therefore, we present a mechanism, adapted from the K-means algorithm, to find the best sensing location for the mobile sink node. It helps to reduce the transmission load on the intermediate network nodes situated between the static sink node and the ordinary sensing nodes. The simulation results show that the proposed scheme can distinctly improve the lifetime of the network, compared to a one-sink protocol employed in continuous object tracking.
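The abstract describes the sink-placement mechanism only as adapted from K-means. As a hypothetical simplified reading, the sketch below clusters the boundary-node coordinates with plain Lloyd's K-means and places the mobile sink at the centroid of the most populated cluster; every name and parameter here is ours, not the paper's.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain Lloyd's K-means on 2-D points; returns (centroids, clusters)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest centroid (squared distance).
            i = min(range(k), key=lambda c: (p[0] - centroids[c][0]) ** 2
                                          + (p[1] - centroids[c][1]) ** 2)
            clusters[i].append(p)
        for i, cl in enumerate(clusters):
            if cl:  # keep the old centroid if a cluster empties out
                centroids[i] = (sum(p[0] for p in cl) / len(cl),
                                sum(p[1] for p in cl) / len(cl))
    return centroids, clusters

def sink_location(boundary_nodes, k=3):
    """Place the mobile sink at the centroid of the largest boundary cluster
    (a hypothetical reading of the paper's K-means-derived placement)."""
    centroids, clusters = kmeans(boundary_nodes, k)
    i = max(range(k), key=lambda c: len(clusters[c]))
    return centroids[i]
```

Weighting clusters by node density, as done here, is one plausible way to keep the mobile sink near the busiest stretch of the object boundary; the paper's actual criterion may differ.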
ARTICLE | doi:10.20944/preprints202009.0304.v2
Subject: Earth Sciences, Atmospheric Science Keywords: COVID-19; waste generation; waste collection; gap assessment; emergency plans
Online: 23 June 2021 (11:39:27 CEST)
The nationwide lockdown imposed to control the spread of the novel coronavirus induced dramatic alterations in different sectors of Nepalese governance, including Solid Waste Management (SWM) practices. This study identifies solid waste (SW) collection gaps in seven major cities of Nepal and surveys municipalities and households on SW management practices before and during the lockdown, to emphasize the linkage between COVID-19 and SWM. It includes information on solid waste status, collection frequency and coverage, worker safety practices, the types of vehicles operated for collection, and the alternative methods adopted by households to manage SW during the lockdown. For this, a survey of 1,329 households and key informant interviews were conducted in seven cities of Nepal during the lockdown. It was found that although the coverage of the collection service was similar during the pandemic, there was a drastic decrease in collection frequency, leading to a collection gap of around 570 tons/day. More than 50% of the surveyed households adopted no proper alternative measures, claiming that they stored solid waste appropriately until municipal authorities could collect it. The study reveals poor occupational health and safety practices among solid waste workers, due to the unavailability of safety gear and equipment, despite their awareness of the modes of transmission of the virus. The pandemic exacerbated the challenges of smooth SWM, an essential service. This study highlights the need for a timely strategic management framework, to be developed by the government, to maintain smooth SWM practices during lockdowns.
ARTICLE | doi:10.20944/preprints202111.0033.v1
Subject: Biology, Animal Sciences & Zoology Keywords: Africa; biodiversity infrastructure; Clupeidae; Clupeiformes; Dactylogyridea; flatworm; historical collection; Monogenea; Pellonulini; sardine
Online: 2 November 2021 (10:28:05 CET)
Unlike their marine counterparts, tropical freshwater clupeids receive little scientific attention. However, they sustain important fisheries that may be of (inter)national commercial interest. Africa harbours over 20 freshwater clupeid species within Pellonulini. Recent research suggests their most abundant parasites are gill-infecting monogenean flatworms within Kapentagyrus. After inspecting specimens of 12 freshwater clupeids from West and Central Africa, mainly sourced in biodiversity collections, we propose 11 new species of Kapentagyrus which we describe using their haptoral and genital morphology. Because of their high morphological similarity, species delineation relies mostly on morphometrics of anchors and hooks. Specifically, earlier, molecular taxonomic work indicated that the proportion between the length of the anchor roots, and between hook and anchor length, are diagnostic. On average, about one species of Kapentagyrus exists per pellonuline species, although Pellonula leonensis harbours four species and Microthrissa congica two, while Microthrissa moeruensis and Potamothrissa acutirostris share a gill monogenean species. This study more than quadruples the number of known species of Kapentagyrus, also almost quadrupling the number of pellonuline species of which monogeneans are known. Since members of Kapentagyrus are informative about their hosts’ ecology, evolutionary history, and introduction routes, this enables a parasitological perspective on several data-poor African fisheries.
Subject: Medicine & Pharmacology, General Medical Research Keywords: COVID-19; Treatment outcome; Data Collection; Pharmaceutical Preparations; Outcome Assessment; Health Care
Online: 4 September 2020 (10:12:02 CEST)
Human infection caused by the SARS-CoV-2 virus, called COVID-19, is a new pandemic with devastating effects worldwide. Science seeks the rational and systematic explanation of phenomena. In pandemics, decisions on the prevention and treatment of people should be taken consistently, supported by scientific knowledge and ethical principles, so as to produce more good than harm. At first, prospective observational studies that systematically collect patient data, correlating protective or therapeutic interventions with outcomes to assess effectiveness and safety, should be prioritized as the most appropriate type of study. The protocol proposed in this article aims to provide doctors with information on harm reduction in early COVID-19 patients by applying individualized interventionist or expectant therapeutic strategies, respecting the autonomy and preferences of physicians and patients in clinical decision-making. The evaluation of clinical status, besides laboratory confirmation of COVID-19, comprises an individualized symptom score for each patient, a global self-perception scale of disease severity, the clinical progression scale developed by the WHO for clinical studies of COVID-19 and, at the first consultation, the doctor's overall impression of the clinical prognosis. The analysis of anonymized data should preferably use descriptive and inferential statistics. The case report form is freely available in the protocol, along with examples of patient informed consent forms for the prescription of off-label medications and authorization to use the data. The results may be useful for identifying interventions that are candidates for efficacy testing in randomized controlled trials, with a higher chance of success. The protocol respects the autonomy and preferences of doctors and patients to decide the best treatment options in uncertain situations.
It also allows the gathering of useful information for future, more rigorous clinical trials, linking science, ethics, and personal clinical experience.
ARTICLE | doi:10.20944/preprints202212.0203.v1
Subject: Arts & Humanities, Art History & Restoration Keywords: George Viau; collection; auction; art market; Impressionists; econometrics; quantitative art history; computational history
Online: 12 December 2022 (13:15:17 CET)
This paper analyzes the collection of George Viau (1855-1939) from a computational perspective. This dental surgeon, who was close to the Impressionists, collected several hundred works of art in the first half of the 20th century: the auctions of his collection, in 1907, 1930 and, after his death, in 1942, 1943 and 1948, accounted for 642 artworks, and during the Occupation the 1942 sale alone produced a total of more than 46 million francs. Using statistics, econometrics, network analysis and cartography, it is possible to identify the salient features of the collection and its place in the world of Parisian auctions.
CONCEPT PAPER | doi:10.20944/preprints202009.0744.v1
Keywords: Myalgic Encephalomyelitis/Chronic Fatigue Syndrome (ME/CFS), Data Collection Standardisation, Research Guidelines, Europe
Online: 30 September 2020 (12:24:58 CEST)
The European Network on Myalgic Encephalomyelitis/Chronic Fatigue Syndrome (EUROMENE) was established after a successful grant application to the European Cooperation in Science and Technology (COST). The network aimed to assess existing knowledge and experience of health care delivery for people with Myalgic Encephalomyelitis/Chronic Fatigue Syndrome (ME/CFS) in European countries and worldwide, and to enhance coordinated research and health care provision in this field. The EUROMENE proposal was based on the establishment of interrelated working groups (WGs), whose participants contributed specific knowledge and viewpoints according to their specialties and areas of interest. In this paper we outline the work of a multidisciplinary team of researchers, including epidemiologists, clinicians, statisticians, biomedical scientists and health economists, who set out recommendations to guide data acquisition for ME/CFS research, aiming to standardise data collection and improve epidemiological research.
ARTICLE | doi:10.20944/preprints202009.0574.v1
Subject: Earth Sciences, Other Keywords: land cover; land use; citizen science; mobile apps; in-situ data collection; LUCAS
Online: 24 September 2020 (08:26:29 CEST)
There are many new land use and land cover (LULC) products emerging, yet there is still a lack of in-situ data for training, validation, and change detection purposes. The LUCAS (Land Use Cover Area frame Sample) survey is one of the few authoritative in-situ field campaigns, taking place every three years in European Union member countries. More recently, a study considered whether citizen science and crowdsourcing could complement LUCAS survey data, e.g., through the FotoQuest Austria mobile app and crowdsourcing campaign. Although the data obtained from that campaign were promising when compared with authoritative LUCAS survey data, there were classes that citizens did not classify well, and the photographs submitted through the app were not always of sufficient quality. For this reason, in the latest FotoQuest Go Europe 2018 campaign, several improvements were made to the app to facilitate interaction with the contributing citizens and to improve their accuracy in LULC identification. In addition to extending the locations from Austria to Europe, a change detection component (comparing land cover in 2018 with the 2015 LUCAS photographs) was added, as well as an improved LC decision tree and a near-real-time quality assurance system providing feedback on the distance to the target location, the LULC classes chosen and the quality of the photographs. Another modification was the implementation of a monetary incentive scheme in which users received between 1 and 3 Euros for each successfully completed quest of sufficient quality. The purpose of this paper is to present these new features and to compare the results obtained by the citizens with authoritative LUCAS data from 2018 in terms of LULC and change in LC. We also compared the results between the FotoQuest campaigns in 2015 and 2018 and found a significant improvement in 2018, i.e., a much higher match of LC between FotoQuest Go Europe and LUCAS.
Finally, we present the results from a user survey to discuss challenges encountered during the campaign and what further improvements could be made in the future, including better in-app navigation and offline maps, making FotoQuest a model for enabling the collection of large amounts of land cover data at a low cost.
ARTICLE | doi:10.20944/preprints201801.0132.v1
Subject: Engineering, Civil Engineering Keywords: clay ball; asphalt pavement; pattern and density; infrared image collection system; field core
Online: 16 January 2018 (05:00:02 CET)
A clay ball is a pavement surface defect: a clump in which clay or dirt is mixed with hot asphalt mixture. Clay balls are typically caused by a combination of aggregate contamination with clay or soil, high aggregate moisture, and low production temperature at the asphalt plant. They usually appear a few weeks or months after paving, under traffic load, after being liquefied and knocked out of the pavement surface. Clay balls can be a source of potholing, raveling, and other issues such as moisture infiltration and reduced ride quality. This paper presents an investigation of clay balls on US-31 one winter after construction in Hamilton County, Indiana. In order to understand the pavement condition, their severity was measured using both visual observation and an infrared image collection system. In addition, the clay ball distribution pattern, density, and core conditions were evaluated. The effect of precipitation on clay ball formation was investigated to identify a cause of the clay balls. The investigation found that the infrared image collection system was effective in detecting clay balls. The clay balls were elliptical in shape and 1 to 4 inches in diameter, and the deepest clay balls penetrated almost the entire surface course. It was also found that asphalt paving on rainy days, or right after rain, could increase the potential for clay balls. Monitoring aggregate moisture during construction on or after rainy days should reduce the risk of clay balls.
ARTICLE | doi:10.20944/preprints201711.0128.v1
Subject: Mathematics & Computer Science, General & Theoretical Computer Science Keywords: metaheuristics; VNS; quantum inspired; qGVNS; optimization; TSP; routing algorithms; GPS application; garbage collection
Online: 20 November 2017 (11:37:18 CET)
General Variable Neighborhood Search (GVNS) is a well-known and widely used metaheuristic for efficiently solving many NP-hard combinatorial optimization problems. Quantum General Variable Neighborhood Search (qGVNS) is a novel, quantum-inspired extension of the conventional GVNS. Its quantum nature derives from the fact that it incorporates tools and techniques from the field of quantum computation. The Travelling Salesman Problem (TSP) is a well-known NP-hard problem which has been broadly used to model many real-life routing cases. As a consequence, the TSP can be used as a basis for modelling and finding routes for Global Positioning System (GPS) applications. In this paper, we examine the potential use of this method for the GPS systems of garbage trucks. Specifically, we provide a thorough presentation of our method, accompanied by extensive computational results. The experimental data, accumulated on a plethora of symmetric TSP instances (symmetric in order to faithfully simulate GPS problems) and shown in a series of figures and tables, allow us to conclude that the novel qGVNS algorithm can provide an efficient solution for this type of geographical problem.
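The abstract does not specify qGVNS itself. For orientation only, the sketch below shows a minimal classical VNS for the symmetric TSP (shaking by random city swaps, 2-opt as the local search), which is the conventional skeleton that GVNS and qGVNS build on; every name and parameter here is ours.

```python
import random

def tour_length(tour, dist):
    """Total length of a closed tour under a symmetric distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt(tour, dist):
    """Local search: reverse segments while any reversal shortens the tour."""
    improved = True
    while improved:
        improved = False
        n = len(tour)
        for i in range(n - 1):
            for j in range(i + 2, n):
                new = tour[:i + 1] + tour[i + 1:j + 1][::-1] + tour[j + 1:]
                if tour_length(new, dist) < tour_length(tour, dist):
                    tour, improved = new, True
    return tour

def shake(tour, k, rng):
    """Perturb the incumbent by k random city swaps (neighborhood index k)."""
    tour = tour[:]
    for _ in range(k):
        i, j = rng.randrange(len(tour)), rng.randrange(len(tour))
        tour[i], tour[j] = tour[j], tour[i]
    return tour

def vns_tsp(dist, k_max=3, iters=50, seed=0):
    """Basic VNS: shake with growing k, descend with 2-opt, keep improvements."""
    rng = random.Random(seed)
    best = two_opt(list(range(len(dist))), dist)
    for _ in range(iters):
        k = 1
        while k <= k_max:
            cand = two_opt(shake(best, k, rng), dist)
            if tour_length(cand, dist) < tour_length(best, dist):
                best, k = cand, 1  # improvement: restart from the first neighborhood
            else:
                k += 1
    return best
```

The quantum-inspired variant would replace the classical shaking/selection steps with operators drawn from quantum computation; this sketch shows only the shared control structure.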
ARTICLE | doi:10.20944/preprints201610.0037.v1
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: collection-distribution center; closed loop supply chain; fuzzy random variable; particle swarm optimization
Online: 11 October 2016 (11:02:47 CEST)
Recycling waste products is an environmentally friendly activity that can bring benefits to a company, saving manufacturing costs and improving economic efficiency. For the beer industry, recycling bottles can reduce manufacturing costs and reduce the industry's carbon footprint. This paper presents a model for a multi-objective collection-distribution center location and allocation problem in a closed-loop supply chain for the beer industry, in which the objective is to minimize total costs and transportation pollution. Uncertainties in the form of randomness and fuzziness are jointly handled to ensure a more practical problem solution; the quantities of returned bottles and unusable bottles are considered fuzzy random variables. A heuristic algorithm based on priority-based global-local-neighbor particle swarm optimization (pb-glnPSO) is applied to ensure reliable solutions for this NP-hard problem. A case study of a beer company is conducted to illustrate the application of the proposed model and demonstrate pb-glnPSO.
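The paper's pb-glnPSO adds a priority-based encoding and global-local-neighbor velocity terms. As background only, the sketch below shows a plain global-best PSO minimizing a generic cost function; it is not the paper's algorithm, and the parameter values are conventional defaults we chose.

```python
import random

def pso(cost, dim, n_particles=20, iters=100, bounds=(-10.0, 10.0), seed=0):
    """Plain global-best particle swarm optimization (not the paper's pb-glnPSO).

    cost: function mapping a position (list of floats) to a scalar to minimize.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    gbest = min(pbest, key=cost)[:]             # swarm's best position
    w, c1, c2 = 0.7, 1.5, 1.5                   # inertia, cognitive, social weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            if cost(pos[i]) < cost(pbest[i]):
                pbest[i] = pos[i][:]
                if cost(pbest[i]) < cost(gbest):
                    gbest = pbest[i][:]
    return gbest
```

In the paper's setting the decision vector would encode facility-location priorities rather than raw coordinates, and the velocity update would mix global, local and neighbor bests instead of the single global best used here.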
ARTICLE | doi:10.20944/preprints201806.0185.v1
Subject: Medicine & Pharmacology, Nursing & Health Studies Keywords: mHealth; mobile data collection; data quality; data quality assessment framework; Tuberculosis control; developing countries
Online: 12 June 2018 (10:34:33 CEST)
Background: Increasingly, healthcare organizations are using technology for the efficient management of data. The aim of this study was to compare the quality of digital records with that of the corresponding paper-based records, using a data quality assessment framework. Methodology: We conducted a desk review of paper-based and digital records over the study duration, April 2016 to July 2016, at six enrolled TB clinics. We entered all data fields of the patient treatment (TB01) card into a spreadsheet-based template to undertake a field-to-field comparison of the fields shared between the TB01 card and the digital data. Findings: A total of 117 TB01 cards were prepared at the six enrolled sites, of which just 50% (n=59) were digitized. There were 1,239 comparable data fields, of which 65% (n=803) matched correctly between the paper-based and digital records. The remaining 35% of data fields (n=436) had anomalies, either in the paper-based records or in the digital records. On average there were 1.9 data quality issues per digital patient record, versus 2.1 issues per paper-based record. Based on the analysis of valid data quality issues, more issues were found in paper-based records (n=123) than in digital records (n=110). Conclusion: There were fewer data quality issues in digital records than in the corresponding paper-based records. Greater use of mobile data capture and continued use of the data quality assessment framework can deliver more meaningful information for decision making.
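The field-to-field comparison described in the methodology can be sketched as a simple record diff over shared fields; the TB01 field names and values below are invented for illustration, not taken from the study.

```python
def compare_records(paper, digital):
    """Count matches and anomalies across the fields shared by a
    paper-based record and its digital counterpart (both dicts)."""
    shared = set(paper) & set(digital)
    matched = sum(1 for f in shared if paper[f] == digital[f])
    return {"compared": len(shared),
            "matched": matched,
            "anomalies": len(shared) - matched}

# Hypothetical TB01 fields; the age was mistyped during digitization:
paper = {"name": "A. Khan", "age": 34, "regimen": "2HRZE/4HR", "weight_kg": 52}
digital = {"name": "A. Khan", "age": 43, "regimen": "2HRZE/4HR", "weight_kg": 52}
result = compare_records(paper, digital)
# result -> {'compared': 4, 'matched': 3, 'anomalies': 1}
```

Summing these per-record counts over all digitized cards would reproduce the study's aggregate figures (1,239 compared fields, 803 matches, 436 anomalies).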
ARTICLE | doi:10.20944/preprints202012.0624.v1
Subject: Social Sciences, Library & Information Science Keywords: COVID-19; SARS-CoV-2; Coronavirus; Bibliometrics; Outbreak; pandemic; Web of Science Core Collection; WoS
Online: 24 December 2020 (13:34:07 CET)
The COVID-19 outbreak calls for immediate research exploration. The objective of this study is to perform a bibliometric analysis of all COVID-19-related publications in the Science Citation Index Expanded (SCI-EXPANDED) in the early stage of the outbreak. Analysis parameters include the performance of authors, institutes, and countries, as well as the distributions of Web of Science categories, journals, languages, and types of publications. The results show that 32% of the papers were published as editorial material, with overwhelming output from Chinese research institutes. An association between research indexes and the number of cases was also found.
ARTICLE | doi:10.20944/preprints201910.0032.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: computerized revenue collection; machine learning; cyber security; software defined networks; object-oriented programming; online database management
Online: 3 October 2019 (01:45:11 CEST)
The need for an accurate and flexible system of revenue collection from internal sources has become a matter of extreme urgency and importance in e-governance. This need underscores the eagerness on the part of governments to look for new principles and policies of revenue collection, or to become aggressive and innovative in collecting revenue from existing sources using the present system. The revenue boards of some African governments, even now, face many setbacks in performing their tasks due to the manual system of collecting revenue from the public. This can be improved through effective revenue collection using an accurate and flexible system. Tax is usually collected in the form of specific sales tax, general sales tax, corporate income tax, individual income tax, property tax and inheritance tax. Problems such as a high cost of collection, fraud, underpayment, revenue leakage, poor access to information and poor tracking of defaulters are on the increase. As a result, there is a need to computerize the revenue collection system. Computerized systems have proven to introduce massive efficiencies and quick collection of revenue from the public. This research demonstrates how to design and implement an automated revenue collection system and how to maintain a secure database of collected tax information. It also studies how machine learning algorithms and Software-Defined Networks can improve the security of such automated systems.
Subject: Earth Sciences, Oceanography Keywords: radionuclide; tracer; data collection; antimony-125 (125Sb); tritium (3H); dispersion; modelling; English Channel; North Sea; Biscay Bay
Online: 24 January 2020 (16:03:50 CET)
Significant amounts of anthropogenic radionuclides were introduced into ocean waters following atmospheric nuclear tests and the development of the nuclear industry. The dispersion of artificial dissolved radionuclides has been extensively measured for decades over the European continental shelf. The radionuclide measurement and release-flux databases provided here represent an exceptional opportunity to validate hydrodynamic dispersion models. The MARS hydrodynamic model has been applied at different scales to reproduce the measured dispersion under realistic conditions. Specific methods have been developed to obtain qualitative and quantitative results and to perform model/measurement comparisons. Model validation spans short to large scales, with dedicated surveys following the dispersion: it was performed in two- and three-dimensional frameworks, and from minutes and hours after a release up to several years. Results are presented concerning the dispersion of radionuclides in marine systems, deduced from standalone measurements or from model comparisons. This allows dispersion over the continental shelf, pathways, transit times, budgets and source terms to be characterised. This review presents the main findings from the point of view of radiotracers, hydrodynamic models and model/measurement methods, with perspectives for applications in other areas or oceanographic domains.
CONCEPT PAPER | doi:10.20944/preprints201911.0178.v1
Subject: Biology, Ecology Keywords: ecological monitoring methods; vegetation composition; vegetation cover; vegetation structure; soil sampling methods; sample management; electronic data collection
Online: 15 November 2019 (08:56:27 CET)
Ecosystem surveillance monitoring is critical to managing natural resources, especially under changing environments. Despite this importance, the design and implementation of monitoring programs across large temporal and spatial scales have been hampered by the lack of appropriately standardised methods and data streams. To address this gap, we outline a surveillance monitoring method based on permanent plots and voucher samples, suited to rangeland environments around the world, that is repeatable, cost-effective, appropriate for large-scale comparisons and adaptable to other global biomes. The method provides comprehensive data on vegetation composition and structure, along with soil attributes relevant to plant growth, delivered as a combination of modules that can be targeted for different purposes or available resources. Plots are located in a stratified design across vegetation units, landforms and climates to enhance continental and global comparisons. Changes are investigated through revisits. Vegetation is measured to inform on composition, cover and structure. Samples of vegetation and soils are collected, tracked by barcode labels and stored long term for subsequent analysis. Technology is used to enhance the accuracy of field methods, including differential GPS for plot locations, instrument-based Leaf Area Index (LAI) measures, and three-dimensional photo-panoramas for advanced analysis. A key feature of the method is the use of electronic field data collection to enhance data delivery into a publicly accessible database. Our method is pragmatic whilst still providing consistent data, information and samples on key vegetation and soil attributes. The method is operational and has been applied at more than 704 field locations across the Australian rangelands as part of the Ecosystem Surveillance program of the Terrestrial Ecosystem Research Network (TERN).
The methodology enables continental analyses and has been tested in communities broadly representative of rangelands globally, with components applicable to other biomes. We also describe the consultative process and guiding principles that drove the development of this method, and recommend them as an approach for extending the method to other biomes. This consistent, standardised and objective method enables continental, and potentially global, analyses that were not previously possible with disparate programs and datasets.
ARTICLE | doi:10.20944/preprints202112.0163.v1
Subject: Social Sciences, Other Keywords: ethnobotany; paleoethnobotany; biocultural heritage; digital heritage; online database; Indigenous data sovereignty; Open Access; research accessibility; digital reference collection
Online: 9 December 2021 (20:01:36 CET)
Biocultural heritage preservation relies on ethnobotanical knowledge and the paleoethnobotanical data used in (re)constructing histories of human-biota interactions. Biocultural heritage, defined as the knowledge and practices of Indigenous and Local peoples and their biological relatives, is often guarded information, meant for specific audiences and withheld from other social circles. As such, these forms of heritage and knowledge must also be included in the ongoing data sovereignty discussions and movement. In this paper we share the process and design decisions behind creating an online database for ethnobotanical knowledge and associated paleoethnobotanical data, using a content management system designed to foreground Indigenous and local perspectives. Our main purpose is to suggest the Mukurtu content management system, originally designed for physical items of cultural importance, be considered as a potential tool for digitizing and ethically circulating biocultural heritage, including paleoethnobotanical resources. With this database, we aim to create access to biocultural heritage and paleoethnobotanical considerations for a variety of audiences while also respecting the protected and sensitive natures of Indigenous and local knowledges.
ARTICLE | doi:10.20944/preprints202104.0012.v1
Subject: Engineering, Automotive Engineering Keywords: GIS in solid waste collection; waste vehicle routing; ArcGIS Network Analyst; waste bin allocation; municipal solid waste management
Online: 1 April 2021 (11:04:58 CEST)
Vehicle routing is a critical factor in municipal solid waste (MSW) collection planning and operations. Poor routing can introduce inefficiencies and cause targeted levels of service or performance to be missed, irrespective of the resources applied. Trial-and-error approaches have proven suboptimal for planning and predicting expected performance. This study explores various Geographic Information System (GIS) tools and analysis techniques, and how they can be applied to optimizing vehicle routes under challenging site conditions. Using the Adentan West residential area, a suburb of Accra, Ghana, as a case study, the current performance of the trial-and-error method was measured, and a GIS computer model was used to evaluate various optimization scenarios and determine the savings that could be made. Field measurements were taken with Global Positioning System (GPS) devices for waste collection activities in areas with varying characteristics and conditions, and the data were analysed for one selected vehicle operating four days per week. It was found that, for a scenario where only the bin collection order was optimized while route selection was restricted by the ArcGIS Network Analyst, 2.6% of travel distance and 2.21% of travel time were saved. For a second scenario, where only the route selection was optimized while the order of bin collection was restricted, 4.1% of travel distance and 1.5% of travel time were saved. For a third scenario, where the order of collection and the route selection were optimized together, 10.9% of travel distance and 3.7% of travel time were saved. Lastly, by regrouping all the bins for daily collection, 4.5% of travel distance and 1.2% of travel time were saved. The results demonstrate that there is always room for optimization of solid waste collection routing, irrespective of site constraints and the challenges that the nature of bin distribution poses to drivers.
In developing countries like Ghana, where there is high demand for services in the face of limited road network access, the application of GIS to route optimization will guide providers in planning and subsequently yield savings in fuel consumption, vehicle maintenance and man-hour costs.
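The idea of saving distance by re-ordering bin visits can be sketched with a simple nearest-neighbour heuristic. This is an illustrative toy on invented straight-line coordinates, not the network-based routing that ArcGIS Network Analyst performs.

```python
from math import dist

def route_length(order, points):
    """Total open-tour travel distance visiting points in the given order."""
    return sum(dist(points[a], points[b]) for a, b in zip(order, order[1:]))

def nearest_neighbour_order(points, start=0):
    """Greedy bin-collection order: always drive to the closest unvisited bin."""
    unvisited = set(range(len(points))) - {start}
    order = [start]
    while unvisited:
        here = order[-1]
        nxt = min(unvisited, key=lambda j: dist(points[here], points[j]))
        order.append(nxt)
        unvisited.remove(nxt)
    return order

# Depot at the origin plus four bins along a street (coordinates in km).
points = [(0, 0), (5, 0), (1, 0), (4, 0), (2, 0)]
naive = route_length(list(range(len(points))), points)   # visit in listed order
optimised = route_length(nearest_neighbour_order(points), points)
saving = 100 * (naive - optimised) / naive               # percent saved
```

On this contrived instance the greedy order cuts the tour from 14 km to 5 km; real networks constrain turns, one-way streets and service times, which is why the study's measured savings are in the single-digit to low-double-digit percent range.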
ARTICLE | doi:10.20944/preprints202003.0109.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: Automated Fare Collection (AFC); Smart Card; Crowding; Practical Waiting Area; Subway Station Platform; Time-Varying; Late-Night Peak
Online: 6 March 2020 (09:02:01 CET)
Management of crowding at subway platforms is essential to improving services, preventing train delays and ensuring passenger safety. Establishing effective measures to mitigate platform crowding requires accurate estimation of actual crowding levels. At present, there are temporal and spatial constraints, since subway platform crowding is assessed only at certain locations, only every 1-2 years, and counting is performed manually. Notwithstanding, smart card data constitute real-time big data generated 24 hours a day and are thus deemed an appropriate basis for estimating crowding. This study proposes the use of smart card data to create a model that dynamically estimates crowding. It first defines crowding as demand, which can be translated into passengers moving dynamically along a subway network. In line with this, our model also identifies the travel trajectory of individual passengers and is able to calculate, every minute, the passenger flow that concentrates and disperses at the platform. Lastly, the level of platform crowding is estimated in a way that considers the effective waiting area of each platform structure.
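The final estimation step can be sketched as a running occupancy count divided by the platform's effective waiting area. The event tuples below are a hypothetical stand-in for the per-minute arrival and boarding flows the model infers from smart card taps; the function and numbers are illustrative, not the paper's model.

```python
def platform_density(events, effective_area_m2):
    """Minute-by-minute platform density (passengers per m2) from a list of
    (minute, passengers_arriving, passengers_boarding_trains) tuples."""
    occupancy = 0
    density = {}
    for minute, arriving, boarding in sorted(events):
        # Passengers accumulate until a train departs and absorbs them.
        occupancy = max(0, occupancy + arriving - boarding)
        density[minute] = occupancy / effective_area_m2
    return density

# Hypothetical late-night peak: passengers pile up, then a train departs.
events = [(0, 40, 0), (1, 60, 0), (2, 30, 110), (3, 50, 0)]
d = platform_density(events, effective_area_m2=200.0)
```

Dividing by the effective waiting area rather than the full platform footprint matters because stairwells, pillars and safety margins reduce the space actually available to waiting passengers.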
ARTICLE | doi:10.20944/preprints201811.0336.v1
Subject: Biology, Plant Sciences Keywords: pasmo resistance; pasmo severity; quantitative trait loci (QTL); quantitative trait nucleotides (QTNs); fiber; linseed; core collection; flax; Linum usitatissimum
Online: 14 November 2018 (10:45:20 CET)
Pasmo is one of the most widespread diseases threatening flax production. To identify genomic regions associated with pasmo resistance (PR), a genome-wide association study was performed on 370 accessions from the flax core collection. Pasmo severity was evaluated in the field from 2012 to 2016 in Morden, MB, Canada. Genotyping-by-sequencing identified 258,873 single nucleotide polymorphisms (SNPs) distributed across all 15 flax chromosomes. Marker-trait associations were identified using ten different statistical models. A total of 692 unique quantitative trait nucleotides (QTNs) associated with 500 putative quantitative trait loci (QTL) were detected from six phenotypic PR datasets (five individual years and the average across years). Different QTNs were identified by the various statistical models and from the individual PR datasets, indicative of the complementarity between analytical methods and/or genotype × environment interactions of the QTL effects. The single-locus models tended to identify large-effect QTNs, while the multi-locus models were able to detect QTNs with smaller effects. Among the putative QTL, 67 had large effects (3-23%), were stable across all datasets and explained 32-64% of the total variation for PR in the various datasets. Forty-five of these QTL spanned 85 resistance gene analogs, including a large Toll/interleukin-1 receptor, nucleotide-binding site, leucine-rich repeat (TNL)-type gene cluster on chromosome 8. The number of positive-effect QTL (NPQTL) in accessions was significantly correlated with PR (R2=0.55), suggesting additive effects. NPQTL was also significantly associated with morphotype (R2=0.52), and major positive-effect QTL were present in the fiber-type accessions. The 67 large-effect QTL are suited for marker-assisted selection, and the 500 QTL for effective genomic prediction in PR molecular breeding.
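The reported R2=0.55 between NPQTL and resistance is a squared Pearson correlation, which can be computed directly. The sketch below uses invented accession data, not the study's phenotypes; it only illustrates how such a statistic is obtained.

```python
def r_squared(x, y):
    """Squared Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy ** 2 / (sxx * syy)

# Hypothetical accessions: count of positive-effect QTL vs. a resistance score.
npqtl = [2, 5, 7, 9, 12, 15]
resistance = [1.1, 2.3, 3.9, 4.2, 6.0, 7.5]
r2 = r_squared(npqtl, resistance)
```

A high R2 in data shaped like this is what supports the paper's inference of additive QTL effects: each extra positive-effect locus shifts resistance by a roughly constant amount.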
ARTICLE | doi:10.20944/preprints201608.0232.v2
Subject: Medicine & Pharmacology, Nursing & Health Studies Keywords: mHealth; ODK scan; mobile health application; digitizing data collection; data management processes; paper-to-digital system; technology-assisted data management; treatment adherence
Online: 2 September 2016 (03:17:38 CEST)
The present grievous tuberculosis situation can be improved by efficient case management and timely follow-up evaluations. With the advent of digital technology, this can be achieved through quick summarization of patient-centric data. The aim of our study was to assess the effectiveness of the ODK Scan paper-to-digital system during a three-month testing period. A sequential, explanatory mixed-method research approach was employed to elucidate technology use. Training, smartphones, the application and 3G-enabled SIMs were provided to the four field workers. At the beginning, baseline measures of the data management aspects were recorded and later compared with endline measures to gauge the impact of ODK Scan. Additionally, at the end, users' feedback was collected regarding app usability, user interface design and workflow changes. 122 patients' records were retrieved from the server and analysed for quality. It was found that ODK Scan correctly recognized 99.2% of multiple-choice bubble responses and 79.4% of numerical digit responses. However, the overall quality of the digital data was lower than that of manually entered data. Using ODK Scan, a significant time reduction was observed in data aggregation and data transfer activities; however, data verification and form filling took more time. Interviews revealed that field workers saw value in using ODK Scan but were concerned about its time-consuming aspects. It is therefore concluded that minimal disturbance of the existing workflow, continuous feedback and value additions are the important considerations for the implementing organization to ensure technology adoption and workflow improvements.
REVIEW | doi:10.20944/preprints202106.0714.v2
Subject: Engineering, Civil Engineering Keywords: Earthquake reconnaissance; damage assessment; data sources; data collection; fieldwork surveys; closed-circuit television videos (CCTV); remote sensing (RS); crowdsourcing platforms; social media (SM)
Online: 4 October 2021 (14:54:59 CEST)
Earthquakes are among the most catastrophic natural phenomena. After an earthquake, reconnaissance enables effective recovery by collecting building damage data and other impacts. This paper aims to identify state-of-the-art data sources for building damage assessment and to provide guidance for more efficient data collection. We have reviewed 38 articles that indicate the sources different authors used to collect data related to damage and post-disaster recovery progress after earthquakes between 2014 and 2021. Current data collection methods are grouped into seven categories: fieldwork or ground surveys, omnidirectional imagery (OD), terrestrial laser scanning (TLS), remote sensing (RS), crowdsourcing platforms, social media (SM) and closed-circuit television videos (CCTV). The selection of a particular data source or collection technique for earthquake reconnaissance depends on what questions the data are meant to answer. We conclude that modern reconnaissance missions cannot rely on a single data source, and that different data sources should complement each other, validate collected data and systematically quantify the damage. The recent increase in the number of crowdsourcing and SM platforms used to source earthquake reconnaissance data demonstrates that these are likely to become increasingly important sources of data.
Subject: Social Sciences, Accounting Keywords: parcel locker; last mile delivery; home delivery; City Logistics; urban freight transport; stated preference; discrete choice modelling; consumer behaviour; e-commerce; channel choice; collection points
Online: 28 May 2021 (12:23:05 CEST)
The surge in e-commerce sales represents a huge challenge for urban freight transport. Parcel lockers constitute a valid solution to the challenges that home deliveries imply. In fact, eliminating courier-consumer contact (also relevant for health-related issues, as made evident by the COVID-19 pandemic) and delivering to a few predefined places might substantially help cope with missed deliveries. Furthermore, this option enables consolidated shipping and reduces delivery trip costs. This paper analyses and compares consumers' preferences for alternative collection strategies. It investigates home delivery versus parcel locker use and forecasts their future market shares. This is performed on the basis of both customers' socio-economic variables and the attributes characterising these alternative logistics fulfilment strategies. The case study rests upon a stated preference survey deployed in the city of Rome. The investigation specifically targets young people (i.e., the population under 30 years of age), since they represent early adopters. Discrete choice models allow both quantifying the monetary value of parcel locker attributes (i.e., willingness-to-pay measures) and estimating the potential demand for this innovative delivery scheme. Results show that distance and accessibility are the main choice determinants. Furthermore, there is an overall high propensity to adopt parcel lockers. This research can support policymakers when implementing such solutions.
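The two outputs of such discrete choice models — market shares and willingness to pay — can be sketched with a minimal multinomial logit. The utility coefficients below are invented for illustration and are not estimates from the Rome survey.

```python
from math import exp

def logit_shares(utilities):
    """Multinomial logit choice probabilities for a set of alternatives."""
    expu = {alt: exp(v) for alt, v in utilities.items()}
    total = sum(expu.values())
    return {alt: e / total for alt, e in expu.items()}

# Hypothetical linear utilities: beta_cost and beta_dist are illustrative
# coefficients (per euro of delivery fee, per km walked to the locker).
beta_cost, beta_dist = -0.8, -0.4
v_home = beta_cost * 3.0                         # home: 3 euro fee, no walk
v_locker = beta_cost * 1.0 + beta_dist * 0.5     # locker: cheaper, 0.5 km away
shares = logit_shares({"home": v_home, "locker": v_locker})

# Willingness to pay for one km less walking: ratio of the coefficients.
wtp_per_km = beta_dist / beta_cost               # euro per km
```

The coefficient ratio is the standard way such models monetise non-price attributes: here a consumer would hypothetically pay 0.50 euro to avoid one extra kilometre, which is exactly the kind of measure used to price locker placement against home delivery.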