ARTICLE | doi:10.20944/preprints202208.0280.v1
Subject: Engineering, Electrical & Electronic Engineering Keywords: software defined radio; radio link; ground plane antenna; wireless communication; internet of things
Online: 16 August 2022 (05:38:06 CEST)
A software defined radio (SDR) is a communication system whose components can be configured through software, in contrast to traditional systems where these components are fixed in hardware; this makes SDR devices far more versatile. This article describes the factors that must be considered when implementing a communication system based on software defined radios in order to reduce attenuation and thus obtain the maximum distance for effective data transmission in the UHF band. The calculations made for the first Fresnel zone and for the design of the ground-plane antennas used in the transmission/reception stages of the bladeRF x40 platforms are also presented. The tests were carried out at the facilities of the Huarangal Nuclear Center of the Peruvian Institute of Nuclear Energy, with favorable results that confirm the versatility and performance of SDRs.
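For orientation, the first-Fresnel-zone radius that such link budgets rely on follows directly from the wavelength and the distances to each end of the path; a minimal sketch with illustrative values, not the paper's test parameters:

```python
from math import sqrt

C = 299_792_458.0  # speed of light, m/s

def first_fresnel_radius_m(freq_hz: float, d1_m: float, d2_m: float) -> float:
    """Radius of the first Fresnel zone at a point d1 / d2 metres from each end."""
    wavelength = C / freq_hz
    return sqrt(wavelength * d1_m * d2_m / (d1_m + d2_m))

# Midpoint of a 2 km UHF link at 433 MHz (illustrative values, not the
# paper's parameters): the path should clear roughly 18.6 m at mid-span.
f = 433e6
print(round(first_fresnel_radius_m(f, 1000, 1000), 1), "m")

# Quarter-wave ground-plane radiator length for the same frequency, with
# the usual ~5% end-effect shortening applied.
print(round(0.95 * (C / f) / 4, 3), "m")
```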
ARTICLE | doi:10.20944/preprints201811.0486.v1
Subject: Mathematics & Computer Science, General & Theoretical Computer Science Keywords: Software-Defined Networking (SDN), Traffic Engineering
Online: 20 November 2018 (08:24:28 CET)
The digital society is an outcome of the Internet, which has made nearly everything connected and accessible no matter where or when. Nevertheless, conventional IP networks, although complicated and very hard to manage, are still widely adopted. Established policies make network configuration and reconfiguration a complex process of reacting to errors, load, and modifications. The prevailing networks are also vertically integrated, which complicates things further: the control and data planes are bundled together. Software-defined networking is a model meant to solve this issue by breaking the vertical integration and detaching the network's control logic from the underlying routers and switches; this is achieved by centralizing network control and making the network programmable. In this work, we implemented MPLS networks with SDN to enhance traffic engineering over the network and to minimize network delay and latency at minimum cost, using three different SDN networks. The experimental results showed the advantage of the proposed approach for reducing network delay compared with previous studies: the average network delay of our approach reaches 3.01 milliseconds.
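The label-switching idea can be sketched against a generic OpenFlow controller. A minimal, hypothetical Ryu application that pushes an MPLS label onto matching IPv4 traffic is shown below; this is not the paper's code, and the match fields, label value, and output port are invented:

```python
# Hypothetical sketch (not the paper's implementation): an SDN controller
# pushing an MPLS label onto matching IPv4 traffic, using the Ryu
# framework with OpenFlow 1.3 switches.
from ryu.base import app_manager
from ryu.controller import ofp_event
from ryu.controller.handler import CONFIG_DISPATCHER, set_ev_cls
from ryu.ofproto import ofproto_v1_3
from ryu.lib.packet import ether_types

class MplsIngress(app_manager.RyuApp):
    OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

    @set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
    def on_switch_ready(self, ev):
        dp = ev.msg.datapath
        parser, ofp = dp.ofproto_parser, dp.ofproto
        # Illustrative match: IPv4 traffic towards 10.0.0.2
        match = parser.OFPMatch(eth_type=ether_types.ETH_TYPE_IP,
                                ipv4_dst='10.0.0.2')
        actions = [parser.OFPActionPushMpls(ether_types.ETH_TYPE_MPLS),
                   parser.OFPActionSetField(mpls_label=100),  # invented label
                   parser.OFPActionOutput(2)]                 # invented port
        inst = [parser.OFPInstructionActions(ofp.OFPIT_APPLY_ACTIONS, actions)]
        dp.send_msg(parser.OFPFlowMod(datapath=dp, priority=10,
                                      match=match, instructions=inst))
```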
ARTICLE | doi:10.20944/preprints201806.0138.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: controller; industry network; open flow; software defined networking; programmable logic controller
Online: 8 June 2018 (13:35:22 CEST)
Trends such as the Industrial Internet of Things (IIoT) and Industry 4.0 have increased the need for powerful network technologies in industrial automation. The growing communication in industrial automation is harnessing the productivity and efficiency of manufacturing and process automation with minimum human intervention. Due to the ongoing evolution of industrial networks from Fieldbus technologies to Ethernet, a new opportunity has emerged to integrate the Software Defined Networking (SDN) technique. In this paper, we provide a brief overview of SDN in the domain of industrial automation. We propose a network architecture called the Software Defined Industrial Automation Network (SDIAN), with the objective of improving network scalability and efficiency. To match the specific considerations and requirements of having a deterministic system in an industrial network, we propose two solutions for flow creation: the Pro-active Flow Installation Scheme (PFIS) and the Hybrid Flow Installation Scheme (HFIS). We analytically quantify how the proposed solutions alleviate the overhead incurred from the flow setup cost. The analytical model is verified through Monte Carlo simulations. We also evaluate the SDIAN architecture and analyze the network performance of the modified topology using the Mininet emulator. We further list and motivate SDIAN features and, in particular, report on an experimental food-processing-plant demonstration featuring Raspberry Pis (RPis) instead of traditional Programmable Logic Controllers (PLCs). Our demonstration exemplifies the characteristics of SDIAN.
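For readers unfamiliar with the emulator, a Mininet experiment of the kind used in such evaluations can be scripted in a few lines of Python. The sketch below is generic, not the SDIAN topology itself:

```python
# Generic Mininet sketch (not the SDIAN topology): two hosts behind one
# OpenFlow switch, driven by an external controller on localhost.
from mininet.net import Mininet
from mininet.node import RemoteController
from mininet.topo import Topo

class TwoHostTopo(Topo):
    def build(self):
        s1 = self.addSwitch('s1')
        for name in ('h1', 'h2'):
            self.addLink(self.addHost(name), s1)

net = Mininet(topo=TwoHostTopo(),
              controller=lambda name: RemoteController(name, ip='127.0.0.1'))
net.start()
net.pingAll()   # basic reachability check across the emulated network
net.stop()
```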
ARTICLE | doi:10.20944/preprints201911.0113.v1
Subject: Mathematics & Computer Science, Other Keywords: software defined networking; random forest; gain ratio; gru-lstm; anova f-rfe; open flow controller; machine learning
Online: 10 November 2019 (14:27:32 CET)
Recent advancements in Software Defined Networking (SDN) make it possible to overcome the management challenges of traditional networks by logically centralizing the control plane and decoupling it from the forwarding plane. Through centralized controllers, SDN can prevent security breaches, but it also brings in new threats and vulnerabilities: the central controller can be a single point of failure. Hence, a flow-based anomaly detection system in the OpenFlow controller can secure SDN to a great extent. In this paper, we investigated two approaches to flow-based intrusion detection in an OpenFlow controller. The first is based on a machine-learning algorithm, where the NSL-KDD dataset with feature selection ensures an accuracy of 82% with a Random Forest classifier using the Gain Ratio feature selection evaluator. In the later phase, the second approach combines a Gated Recurrent Unit Long Short-Term Memory intrusion detection model based on a Deep Neural Network (DNN), where we applied ANOVA F-test and Recursive Feature Elimination feature selection to improve classifier performance and achieved an accuracy of 88%. Substantial experiments with comparative analysis clearly show that deep learning would be a better choice for intrusion detection in the OpenFlow controller.
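The feature-selection stages can be approximated with standard tooling. Below is a hedged scikit-learn sketch of ANOVA F-test ranking followed by Recursive Feature Elimination around a Random Forest; scikit-learn has no Gain Ratio evaluator, so this stands in for the paper's exact setup, and the NSL-KDD loader is a hypothetical helper:

```python
# Sketch of flow-feature selection + classification (stand-in for the
# paper's pipeline). load_nsl_kdd() is a hypothetical helper returning a
# numeric feature matrix X and attack labels y.
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE, SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline

X, y = load_nsl_kdd()   # hypothetical loader, not shown
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = Pipeline([
    ('anova', SelectKBest(f_classif, k=20)),   # ANOVA F-test ranking
    ('rfe', RFE(RandomForestClassifier(n_estimators=100, random_state=0),
                n_features_to_select=10)),     # recursive elimination
    ('model', RandomForestClassifier(n_estimators=100, random_state=0)),
])
clf.fit(X_tr, y_tr)
print('accuracy:', clf.score(X_te, y_te))
```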
REVIEW | doi:10.20944/preprints202207.0281.v1
Subject: Mathematics & Computer Science, Other Keywords: Optical networks; software-defined networks; fifth-generation wireless network (5G); network service orchestrators; customer-specific requirements; Quality of Service; flexibility
Online: 19 July 2022 (07:44:11 CEST)
Optical networks offer a wide range of benefits to the telecommunication sector worldwide through their provision of higher bandwidth, which leads to faster data speeds, longer transmission distances, and improved latency. Currently, the complexity associated with advancements in optical networks poses problems for network flexibility, reliability, and quality of service. Over the years, many reviews and proposals in the literature have offered solutions for optical networks using software-defined networks and network service orchestrators. This paper reviews the significant challenges in current optical network applications, the various solutions rendered by software-defined networks as well as network service orchestration, and the impediments and gaps in these software-defined networks. It goes a step further to look into the various improvements and implementations of software-defined networks tailored to solve specific optical network problems. The paper further proposes a flexible orchestration architecture for software-defined networks to solve flexibility and scalability problems in optical networks. This proposal uses the Open Network Operating System (ONOS) SDN controller, leveraging Docker containerisation as well as Kubernetes clustering and orchestration. The solution presents more flexible, reliable, customisable, and higher-quality service, improving upon current solutions in the literature.
ARTICLE | doi:10.20944/preprints202009.0728.v3
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: Agile Software Development; Agile Methods; Software Team Productivity; Normality; Statistical Model
Online: 29 March 2021 (11:14:51 CEST)
Agile methods promise to achieve high productivity and provide high-quality software. Agile software development is the most important approach to have spread through the world of software development over the past decade. Measuring software team productivity is essential in agile teams to increase the performance of software development. Due to the prevalence of agile methodologies and the increasing competition among software development companies, software team productivity has become one of the crucial challenges for agile software companies and teams. Awareness of the level of team productivity can help them achieve better estimates of the time and cost of projects. However, there is no definitive solution or approach for measuring software productivity, whether in traditional or agile software development teams, which leads to many problems in achieving a reliable definition of software productivity. Hence, this study proposes a statistical model to assess team productivity in agile teams. A survey was conducted with forty software companies, measuring the impact of six team factors on productivity in these companies. The results show that team effectiveness factors, including the inter-team relationship, quality conformance by the team, team vision, team leader, and requirements handled by the team, had a significant impact on the team's productivity. Moreover, the results also show that inter-team relations affect software teams' productivity the most. Finally, the model fit test showed that 80% of productivity depends on team effectiveness factors.
ARTICLE | doi:10.20944/preprints201805.0464.v1
Subject: Engineering, Other Keywords: sustainability; software sustainability; information and communication technology; software design; sustainability requirement; software sustainability analysis; software sustainability guidelines; karlskrona manifesto
Online: 31 May 2018 (09:44:28 CEST)
As in other ICT communities, sustainability in software engineering is a major research and development concern. Current research focuses on eliciting the meanings of sustainability and proposing approaches for its engineering and integration into the mainstream software development lifecycle. However, few concrete guidelines are available that software designers can apply effectively. Such guidelines are needed for eliciting sustainability requirements and for testing software against them. This paper introduces a sustainability design catalogue to assist software developers and managers in eliciting sustainability requirements, and then in measuring and testing software sustainability. The paper reviews the current research on sustainability in software engineering, which forms the grounds for the development of the catalogue. Four different case studies were analyzed using the Karlskrona Manifesto on sustainability design. The output of this research is a software sustainability design catalogue through which a pilot framework is proposed that includes a set of sustainability goals, concepts, and methods. The integration of sustainability for and in software systems requires a concrete framework that exemplifies how to apply and quantify sustainability. The paper demonstrates how the proposed catalogue provides a step in this direction through a series of guidelines.
ARTICLE | doi:10.20944/preprints201809.0073.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: Software Process Analysis, Software Process Improvement, Data Provenance
Online: 4 September 2018 (16:30:51 CEST)
Companies have been increasing the amount of data they collect from their systems and processes, given the decrease in the cost of memory and storage technologies in recent years. The emergence of technologies such as Big Data, Cloud Computing, and E-Science, and the growing complexity of information systems, made evident that traceability and provenance are promising approaches. Provenance has been successfully used in complex domains like the health sciences, chemical industries, and scientific computing, considering that these areas require a comprehensive semantic traceability mechanism. Based on this, we investigate the use of provenance in the context of the Software Process (SP) and introduce a novel approach based on provenance concepts to model and represent SP data. It addresses the capture and storage of SP provenance data, the inference of new information, and visualization. The main contribution of our approach is PROV-SwProcess, a provenance model that deals with the specificities of SP and supports process managers in handling vast amounts of execution data during process analysis and data-driven decision-making. A set of analysis possibilities was derived from this model using SP goals and questions. A case study was conducted in collaboration with a software development company to instantiate the PROV-SwProcess model (using the proposed approach) with real-world process data. This study showed that 87.5% of the analysis possibilities using real data were correct and can assist in decision-making, while 62.5% of them cannot be performed by the process manager using their current dashboard or process management tool.
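For readers unfamiliar with PROV, the flavour of such a model can be sketched with the W3C PROV reference implementation for Python. These are generic PROV-DM statements, not the actual PROV-SwProcess schema; the namespace and identifiers are invented:

```python
# Generic PROV-DM sketch with the `prov` package (not the actual
# PROV-SwProcess schema): a developer (agent) performs a coding activity
# that generates a software artifact (entity). Names are illustrative.
from prov.model import ProvDocument

doc = ProvDocument()
doc.add_namespace('sp', 'http://example.org/swprocess#')  # invented namespace

dev = doc.agent('sp:developer-1')
act = doc.activity('sp:implement-feature-42')
art = doc.entity('sp:artifact-login-module')

doc.wasAssociatedWith(act, dev)   # who performed the activity
doc.wasGeneratedBy(art, act)      # what the activity produced
doc.wasAttributedTo(art, dev)     # provenance of the artifact

print(doc.get_provn())            # serialize as PROV-N for inspection
```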
REVIEW | doi:10.20944/preprints202104.0572.v1
Subject: Medicine & Pharmacology, Allergology Keywords: Digital Smile Design; digital dentistry; dentistry software; dentistry design software
Online: 21 April 2021 (11:46:39 CEST)
Breakthroughs in technology and applications could not have been accomplished without impacting the dental sciences. Dentistry and dental materials have been fully engaged in the advancement of technology and information technology, so much so that these have revolutionized dental techniques. Material and methods: We aim to produce the first in a series of articles in this review on the use of digital techniques and software, such as Digital Smile Design. The goal is to gather all the findings on the use of this program and to highlight its fields of use. The analysis included forty-nine articles discussing the use of Digital Smile Design and its areas of use. The research aims to classify the dental fields using "digitization"; change is constant in this field, and interest in dentistry will keep increasing given the speed and reliability of outcomes for care planning. Conclusion: As seen in the study, the digital workflow facilitates recovery that is reliable from both an aesthetic and a functional point of view. The current areas of use of Digital Smile Design techniques in the different branches of medicine and dentistry, as well as the state of knowledge, have emerged from this research.
ARTICLE | doi:10.20944/preprints202009.0478.v1
Subject: Keywords: Software development; SDLC; Secure software development challenges; security development lifecycle
Online: 20 September 2020 (14:48:42 CEST)
The main focus of this paper is to analyze and discuss the secure software development practices currently adopted in the industry, along with their significance, and to identify the challenges developers face when undertaking measures and techniques for writing secure software. It is well known that software security has been the top priority of many software companies, such as Google and Facebook, to thwart attackers and protect user data in a world full of cybercriminals. Understanding how most software companies in the industry operate to ensure security helps developers identify strengths and weaknesses in their current security frameworks. Hence, by reviewing previous literature relevant to the topic and by conducting an interview with a professional in the field, this paper provides insights into the most popular secure software development frameworks and practices as well as the problems companies face when adopting them. Several security practices and activities required to create secure software are discovered, alongside the problems that arise when companies try to apply these practices. This paper also proposes a few solutions to resolve these problems, which can be easily understood and implemented by software companies to transition into a truly secure software development environment.
ARTICLE | doi:10.20944/preprints201912.0063.v1
Subject: Engineering, Other Keywords: Software Quality Metrics; closed source software; open source software; Kahane's Approach; UCP (Use Case Points) model; William's Models
Online: 5 December 2019 (08:37:56 CET)
The complexity of software is increasing day by day due to the growing size of the projects being developed. For better planning and management of large software projects, estimating software quality is important. During development, complexity metrics are used as indicators of the attributes and characteristics of quality software. There are many studies on the effect of software complexity on cost and quality. In this study, we discuss the effects of software complexity on the quality attributes of open source and closed source software; the quality metrics for the two are not distinct from each other. We comparatively analyze the impact of complexity metrics on open source and closed source software, and we present various models for managing project complexity, such as William's Model, Stacey's Agreement and Certainty Matrix, Kahane's Approach, and the UCP model. Quality metrics here refer to standards for measuring software quality, comprising attributes or characteristics of the software related to its quality. The quality attributes addressed in this study include usability, reliability, security, portability, maintainability, efficiency, cost, standards, and availability. Both open source and closed source software are evaluated on the basis of these quality attributes. This study also recommends future approaches to managing the quality of open source and closed source software projects and identifies which of the two is mostly used in the industry.
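Among the models listed, UCP is directly computable. A sketch in Karner's standard textbook form follows; the constants are the conventional weights, not values taken from this study:

```python
# Use Case Points in Karner's standard form (textbook constants,
# not values from this study).
def use_case_points(uaw: float, uucw: float,
                    technical_factor_sum: float,
                    environmental_factor_sum: float) -> float:
    uucp = uaw + uucw                              # unadjusted UCP
    tcf = 0.6 + 0.01 * technical_factor_sum        # technical complexity factor
    ecf = 1.4 - 0.03 * environmental_factor_sum    # environmental factor
    return uucp * tcf * ecf

# Example: UAW = 12, UUCW = 150, TF = 38, EF = 20  ->  UCP ~ 127
print(round(use_case_points(12, 150, 38, 20), 1))
```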
ARTICLE | doi:10.20944/preprints202008.0275.v1
Subject: Biology, Plant Sciences Keywords: transposable elements; genome annotation; software evaluation
Online: 12 August 2020 (08:07:14 CEST)
Background: Transposable elements (TEs) constitute the vast majority of all eukaryotic DNA and display extreme diversity, with thousands of families. Given their abundance and diversity, TE discovery and annotation become challenging. At present, tools and databases have built libraries to mask TEs in genomes based on de novo and homology-based identification strategies, but no consensus criteria for which tools should be used have been proposed. Results: For the de novo strategy, we compared the performance of TE libraries developed by four commonly used tools (RepeatModeler, LTR_FINDER, LTRharvest, and MITE_Hunter), using a simulated genome as a standard control. The results showed that the performance of RepeatModeler decreased when combined with either LTR_FINDER or LTRharvest, whereas the combination of RepeatModeler and MITE_Hunter performed better than either tool alone. For the homology-based strategy, we evaluated different sources from a taxonomic point of view to build an accurate TE library. When selecting a library from databases to identify TEs in the Arabidopsis thaliana genome, a library from a genus genetically closer to Arabidopsis achieved better performance than libraries from more distantly related genera. Excluding Arabidopsis itself, the combination of the three genera closest to Arabidopsis performed better than the combination of all genera. Conclusion: This study proposes a series of recommendations for accurate TE annotation: 1) for the de novo strategy, RepeatModeler and MITE_Hunter are suggested for building a TE library; 2) for the homology-based strategy, it is recommended to use the library of a genus genetically close to the species of interest rather than a combined library from all genera.
ARTICLE | doi:10.20944/preprints201912.0060.v1
Subject: Keywords: mobile app, software quality anti-patterns
Online: 5 December 2019 (04:16:35 CET)
As time passes, changes in the technology world lead to the evolution of mobile applications as well. With the evolution of the mobile industry, it is an open challenge for software quality researchers to determine how to enhance software quality to meet the needs of change. Quality assurance plays a key role in differentiating good applications from bad ones. With the continuous evolution of mobile applications, the development process should be quick and efficient to comply with user requirements and satisfaction. These demands can lead to bad design choices known as antipatterns, which in turn affect the reliability of the code. A tool-based method, PAPRIKA, is used in the proposed research to identify and monitor these antipatterns, together with a two-step assessment model for software quality assurance and object-oriented software quality metrics.
ARTICLE | doi:10.20944/preprints201909.0238.v1
Subject: Engineering, Control & Systems Engineering Keywords: Software runtime entropy; failure prediction; indicator
Online: 20 September 2019 (10:49:11 CEST)
With the development of computer science and software engineering, software becomes more and more complex. Traditional software reliability assurance techniques, including software testing and evaluation, cannot ensure reliable execution of software after deployment. Software failure prediction techniques based on failure indicators can predict software failures from abnormal indicator values, which can be collected using runtime monitoring techniques. An essential part of this method is finding proper indicators that have a strong correlation with software failures. We propose a novel type of indicator in this work, named software runtime entropy, which takes both module execution time and call counts into consideration. Three common open-source programs (grep, flex, and gzip) are used as case studies for finding the relationships between the indicators and software failures. First, a series of fault injection experiments is conducted on each of the three programs. The decision tree algorithm is used to train on these data to build correlation models between software runtime entropy and software failures. Several common measures from the machine learning domain, such as accuracy, recall, and F-measure, are used to evaluate the models. The decision tree models can serve as failure mechanisms to assist failure prediction: one can examine the value of the runtime entropy and issue a warning when it moves from the normal interval to an abnormal one.
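The paper's exact entropy definition is not reproduced here, but the idea of an entropy over module execution profiles can be sketched as the Shannon entropy of each module's share of runtime effort. This is one plausible interpretation, assuming per-module (time, calls) samples from a monitor; the profile data below is invented:

```python
# Illustrative sketch only: one plausible reading of a runtime entropy,
# computed as the Shannon entropy of each module's share of total
# execution effort (time * call count). The paper's exact definition
# may differ; the profile data is invented.
from math import log2

def runtime_entropy(profile: dict[str, tuple[float, int]]) -> float:
    """profile maps module name -> (execution time in s, call count)."""
    weights = {m: t * calls for m, (t, calls) in profile.items()}
    total = sum(weights.values())
    shares = [w / total for w in weights.values() if w > 0]
    return -sum(p * log2(p) for p in shares)

normal = {'parse': (0.8, 120), 'match': (2.1, 300), 'output': (0.3, 120)}
faulty = {'parse': (0.8, 120), 'match': (9.7, 4500), 'output': (0.1, 3)}
# A shift out of the normal interval would trigger a warning report.
print(runtime_entropy(normal), runtime_entropy(faulty))
```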
ARTICLE | doi:10.20944/preprints202205.0398.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: Generative Software Development; Code Generation; Complexity Space
Online: 30 May 2022 (11:32:07 CEST)
This survey proposes an evaluation model for analyzing and examining different approaches to generativity. In addition to problem-domain concepts, the model is defined using the concepts of complexity and complexity management, together with a systems view, to achieve a unified and integrated treatment of otherwise disparate evaluation criteria. The survey first introduces its approach to these concepts and then presents the evaluation model.
ARTICLE | doi:10.20944/preprints202101.0082.v2
Subject: Earth Sciences, Atmospheric Science Keywords: Shoreline Evolution; Open-Source Software; GIS; Modeling
Online: 19 February 2021 (09:46:48 CET)
This paper presents the validation of the End Point Rate (EPR) tool for QGIS (EPR4Q), a tool built in the QGIS Graphical Modeler to calculate shoreline change by the End Point Rate method. EPR4Q tries to fill the gap left by the lack of a user-friendly, free, open-source tool for shoreline analysis in a Geographic Information System environment: the most widely used software, the Digital Shoreline Analysis System (DSAS), although a free extension, is built for commercial software, while the best free open-source option for calculating EPR, Analyzing Moving Boundaries Using R (AMBUR), is a robust and powerful tool whose complexity and heavy processing can restrict accessibility and simple usage. The validation methodology consists of applying EPR4Q, DSAS, and AMBUR to different examples of shorelines found in nature, extracted from a U.S. Geological Survey Open-File report. The results of each tool were compared using the Pearson correlation coefficient. The validation results indicate that the EPR4Q tool achieved high correlation values with DSAS and AMBUR, reaching coefficients of 0.98 to 1.00 on linear, extensive, and non-extensive shorelines, guaranteeing that the EPR4Q tool is ready to be freely used by academic, scientific, engineering, and coastal management communities worldwide.
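The End Point Rate itself is simple: the net movement between the oldest and youngest shoreline positions along a transect, divided by the elapsed time. A stand-alone illustration, independent of the QGIS tooling:

```python
# End Point Rate along one transect: net shoreline movement divided by
# the time between the oldest and youngest shorelines (stand-alone
# illustration, independent of the QGIS tooling; values invented).
def end_point_rate(dist_oldest_m: float, dist_youngest_m: float,
                   year_oldest: float, year_youngest: float) -> float:
    movement = dist_youngest_m - dist_oldest_m       # + accretion, - erosion
    return movement / (year_youngest - year_oldest)  # metres per year

print(end_point_rate(12.0, -23.5, 1984, 2014))  # ~ -1.18 m/yr (erosion)
```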
ARTICLE | doi:10.20944/preprints202012.0070.v1
Subject: Social Sciences, Accounting Keywords: software training; simulation workflows; SimPhoNy; Simphony-Remote
Online: 2 December 2020 (15:27:18 CET)
Hands-on training in Integrated Computational Materials Engineering (ICME) is characterized by the assisted application and combination of multiple simulation software tools and data. In this paper, we present recent experiences in establishing a cloud-based infrastructure to enable remote use of dedicated commercial and open-access simulation tools during an interactive online training event. In the first part, we summarize the hardware and software requirements and illustrate how these were met using cloud hardware services, a simulation platform environment, a suitable communication channel, common workspaces, and more. The second part of the article focuses (i) on the requirements for suitable online hands-on training material and (ii) on details of some of the approaches taken. Finally, the practical experiences gained during three consecutive online training courses held in September 2020, with 35 nominal participants each, are discussed in detail.
ARTICLE | doi:10.20944/preprints201812.0029.v2
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: VANET; software-defined networking; mobile edge computing
Online: 5 December 2018 (12:26:50 CET)
VANET networks are a class of peer-to-peer wireless networks used to organize communication between cars (V2V), between cars and infrastructure (V2I), and between cars and other types of nodes (V2X). These networks are based on the DSRC and 802.11 standards and are mainly intended for exchanging various types of messages, chiefly emergency ones, to prevent road accidents, to alert when road accidents occur, or to control driveway priority. Initially it was assumed that cars would interact only with each other, but later, with the advent of the Internet of Things (IoT), researchers began to analyze connectivity with other devices, which in general allows various road users and other devices usable in intelligent transport infrastructure to be combined into a single smart city management system. Infrastructure is necessary for providing services and for monitoring and managing the VANET network; as infrastructure objects, stationary Roadside Units (RSUs) are proposed. The aim of this paper is to analyze the use of mobile edge computing to decrease the load on the base station and the latency between RSU clouds, and to present a real experiment using software defined networking and mobile edge computing for RSUs.
REVIEW | doi:10.20944/preprints201912.0061.v1
Subject: Engineering, Other Keywords: scope creep; software engineering; software project management; work breakdown structure; agile method; traditional methodology; functional point analysis; stakeholders
Online: 5 December 2019 (04:20:06 CET)
Scope, time, and cost permanently affect one another, and most Information Technology projects fail due to these three factors. Scope shifting mostly occurs due to time and cost pressures. At project start, lack of understanding of the project and product scope is a focal concern that leads to unsuccessful projects. A complete definition of the software scope determines the quality of the project, and defining the customer requirements and the definite scope of the project plays a key role in implementing project management. Complications originate when systems are developed from impractical expectations and misunderstood requirements. These problems cause many of the changes that occur during system development and lead to poor scope management. Scope creep is one of the most momentous parameters influencing the success of a project: failure to manage scope creep accounts for 80 percent of software project failures. However, using an agile approach, the impact of scope creep on projects becomes insignificant. A correctly defined scope leads to a quality product, delivered within the identified plans and at the agreed cost to the stakeholders.
ARTICLE | doi:10.20944/preprints201803.0217.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: Software protection, Privacy, Innovation and Technology, Web Services Modeling, Distributed Objects, Services Software, Cryptographic Controls, Authentication, Data Encryption
Online: 26 March 2018 (13:18:20 CEST)
This paper presents a method for a decentralised peer-to-peer software license validation system using cryptocurrency blockchain technology to ameliorate software piracy and to provide a mechanism for software developers to protect copyrighted works. Protecting software copyright has been an issue since the late 1970s, and software license validation has been a primary method employed in attempts to minimise software piracy and protect software copyright. The method described creates an ecosystem in which the rights and privileges of participants are observed.
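The paper gives no code, but the core validation step of any such scheme can be sketched: a license is valid if its hash appears in a shared append-only record. In the stand-alone illustration below, an in-memory set stands in for the blockchain ledger; a real system would also verify a developer signature:

```python
# Stand-alone illustration of the validation step: an in-memory set
# stands in for the shared blockchain ledger, and hash membership stands
# in for full on-chain validation (signature checks elided).
import hashlib

ledger: set[str] = set()   # stand-in for on-chain license records

def register_license(license_key: str) -> None:
    ledger.add(hashlib.sha256(license_key.encode()).hexdigest())

def validate_license(license_key: str) -> bool:
    return hashlib.sha256(license_key.encode()).hexdigest() in ledger

register_license("ACME-2018-XYZ-001")   # invented key
print(validate_license("ACME-2018-XYZ-001"))  # True
print(validate_license("forged-key"))         # False
```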
Subject: Mathematics & Computer Science, Numerical Analysis & Optimization Keywords: harmony search; meta-heuristic; parameter optimization; software defect prediction; just-in-time prediction; software quality assurance; maintenance; maritime transportation
Online: 31 December 2020 (09:27:46 CET)
Software plays the most important role in recent vehicle innovation, and consequently the amount of software has grown rapidly over recent decades. The safety-critical nature of ships, one class of vehicles, makes Software Quality Assurance (SQA) a fundamental prerequisite. Just-In-Time Software Defect Prediction (JIT-SDP) aims to conduct software defect prediction (SDP) on commit-level code changes to achieve effective SQA resource allocation. The first case study of SDP in the maritime domain reported feasible prediction performance. However, we consider that the prediction model still has room for improvement, since its parameters have not yet been optimized. Harmony Search (HS) is a widely used music-inspired meta-heuristic optimization algorithm. In this article, we demonstrate that JIT-SDP can achieve better prediction performance by applying HS-based parameter optimization with a balanced fitness value. Using two real-world datasets from a maritime software project, we obtained an optimized model that exceeds the performance criterion of the previous case study's baseline across various defect-to-non-defect class imbalance ratios. Experiments with open source software also showed better recall for all datasets, even though we used balance as the performance index. HS-based parameter-optimized JIT-SDP can be applied to maritime-domain software with high class imbalance ratios. Finally, we expect that our research can be extended to improve the performance of JIT-SDP not only in maritime-domain software but also in open source software.
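Harmony Search itself is compact enough to sketch. Below is a generic continuous-parameter version using the standard HMCR/PAR scheme; the objective is a placeholder, not the paper's JIT-SDP fitness function:

```python
# Generic Harmony Search (standard HMCR/PAR scheme); the objective below
# is a placeholder, not the paper's JIT-SDP fitness function.
import random

def harmony_search(objective, bounds, hms=10, hmcr=0.9, par=0.3,
                   bandwidth=0.05, iterations=2000):
    dim = len(bounds)
    memory = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    scores = [objective(h) for h in memory]
    for _ in range(iterations):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if random.random() < hmcr:           # memory consideration
                x = random.choice(memory)[d]
                if random.random() < par:        # pitch adjustment
                    x += random.uniform(-1, 1) * bandwidth * (hi - lo)
            else:                                # random consideration
                x = random.uniform(lo, hi)
            new.append(min(max(x, lo), hi))
        worst = max(range(hms), key=scores.__getitem__)
        s = objective(new)
        if s < scores[worst]:                    # minimization: replace worst
            memory[worst], scores[worst] = new, s
    best = min(range(hms), key=scores.__getitem__)
    return memory[best], scores[best]

# Placeholder objective: sphere function over three parameters.
print(harmony_search(lambda v: sum(x * x for x in v), [(-5, 5)] * 3))
```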
ARTICLE | doi:10.20944/preprints202102.0513.v1
Subject: Earth Sciences, Atmospheric Science Keywords: Sea-Level Rise; GIS; Open-Source Software; Modeling
Online: 23 February 2021 (12:39:09 CET)
Sea-level rise is a problem increasingly affecting coastal areas worldwide. The existence of free and open-source models to estimate sea-level impact can contribute to better coastal management. This study aims to develop and validate two different models to predict sea-level rise impact, supported by Google Earth Engine (GEE), a cloud-based platform for planetary-scale environmental data analysis. The first model is a Bathtub Model based on the uncertainty of projections of the Sea-level Rise Impact Module of the TerrSet Geospatial Monitoring and Modeling System software. The validation process, performed on the Rio Grande do Sul coastal plain (S Brazil), resulted in correlations from 0.75 to 1.00. The second model implements the Bruun Rule formula in GEE and can determine the coastline retreat of a profile through the creation of a simple vector line from topo-bathymetric data. The model shows a very high correlation (0.97) with a classical Bruun Rule study performed on the Aveiro coast (NW Portugal). The GEE platform appears to be an important tool for coastal management. The models developed have been openly shared, enabling continuous improvement of the code by the scientific community.
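The Bruun Rule used by the second model is a one-line formula: retreat R = S L / (B + h), for sea-level rise S, active profile length L, berm height B, and closure depth h. A generic sketch, not the GEE implementation:

```python
# Bruun Rule in its classical form (generic sketch, not the GEE code):
# shoreline retreat R for sea-level rise S over an active profile of
# cross-shore length L, berm height B, and closure depth h.
def bruun_retreat_m(slr_m: float, profile_length_m: float,
                    berm_height_m: float, closure_depth_m: float) -> float:
    return slr_m * profile_length_m / (berm_height_m + closure_depth_m)

# Illustrative values: 0.5 m rise, 800 m profile, 2 m berm, 8 m closure.
print(bruun_retreat_m(0.5, 800, 2, 8))  # 40.0 m of retreat
```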
ARTICLE | doi:10.20944/preprints202102.0421.v1
Subject: Earth Sciences, Atmospheric Science Keywords: Sea-Level Rise; GIS; Open-Source Software; Modeling
Online: 18 February 2021 (13:52:49 CET)
Sea-level rise is a problem increasingly affecting coastal areas worldwide. The existence of free and open-source models to estimate sea-level impact can contribute to better coastal management. This study aims to develop and validate two different models to predict sea-level rise impact, supported by Google Earth Engine (GEE), a cloud-based platform for planetary-scale environmental data analysis. The first model is a Bathtub Model based on the uncertainty of projections of the Sea-level Rise Impact Module of the TerrSet Geospatial Monitoring and Modeling System software. The validation process, performed on the Rio Grande do Sul coastal plain (S Brazil), resulted in correlations from 0.75 to 1.00. The second model implements the Bruun Rule formula in GEE and can determine the coastline retreat of a profile through the creation of a simple vector line from topo-bathymetric data. The model shows a very high correlation (0.97) with a classical Bruun Rule study performed on the Aveiro coast (NW Portugal). The GEE platform appears to be an important tool for coastal management. The models developed have been openly shared, enabling continuous improvement of the code by the scientific community.
ARTICLE | doi:10.20944/preprints202011.0418.v1
Subject: Mathematics & Computer Science, Algebra & Number Theory Keywords: Security patterns; Software patterns; Systematic literature review (SLR)
Online: 16 November 2020 (12:13:53 CET)
Security patterns encompass security-related issues in secure software system development and operations that often appear in certain contexts. Since the late 1990s about 500 security patterns have been proposed. Although the technical components are well investigated, the direction, overall picture, and barriers to implementation are not. Here, a systematic literature review of 240 papers is used to devise a taxonomy for security pattern research. Our taxonomy and the survey results should improve communications among practitioners and researchers, standardize the terminology, and increase the effectiveness of security patterns.
ARTICLE | doi:10.20944/preprints202011.0410.v1
Online: 16 November 2020 (10:39:44 CET)
DevOps is an emerging practice in the software development life cycle. The name DevOps indicates an integration of the development and operations teams, and the practice is followed to integrate the various stages of the development cycle. DevOps is an extended version of the existing Agile method, aiming at continuous integration, continuous delivery, continuous improvement, faster feedback, and security. This paper reviews the building blocks of DevOps, the challenges in adopting DevOps, models to improve DevOps practices, and future work on DevOps.
ARTICLE | doi:10.20944/preprints202008.0681.v1
Subject: Materials Science, General Materials Science Keywords: Quantum mechanics; DFT; Pseudopotential; Total energy calculation; Software
Online: 30 August 2020 (17:37:33 CEST)
We present software for total-energy calculation by a first-principles quantum mechanical method with a graphical user interface (GUI). The total-energy calculation in this software is based on numerical analysis of time-dependent density functional theory (the numerical method used is finite-difference time-domain). The QUMEC package is equipped with common exchange-correlation energy terms and electron spin polarization calculation. With this package, users can calculate the total energy of free particles, bulk materials, and materials with free surfaces at the atomic scale. The package has been tested on several physical subjects, i.e., the surface energy of nano-LiCoO2 and the diffusion constant of lithium atoms in LiNi0.5Mn1.5O4.
ARTICLE | doi:10.20944/preprints202008.0564.v1
Subject: Chemistry, Electrochemistry Keywords: Li-ion battery; computer simulation; numerical method; software
Online: 26 August 2020 (07:45:40 CEST)
This code provides computational facilities to simulate current versus time during the charging of Li-ion cells at a desired constant voltage, considering multiscale physical phenomena. The code considers only a powder of active material (at the microscale or nanoscale) and a small part of the electrolyte around it as a half cell; this is then extended to a complete cell by applying correct boundary conditions. By modifying its parameters, the code is very useful for understanding the effects of the complex shape of the active-material powder (surface area and powder size), the kind of electrolyte, and the applied voltage on the charging response of a Li-ion cell. In summary, this code provides a microscale approach to the design of Li-ion cells.
ARTICLE | doi:10.20944/preprints202007.0001.v1
Subject: Mathematics & Computer Science, Probability And Statistics Keywords: COVID-19; Disease Modelling; SIR Model; R software
Online: 1 July 2020 (08:40:40 CEST)
The crux of the paper is to present a detailed analysis of globally available COVID-19 data. This analysis is performed using specific packages of the R software environment. It provides various insights from the data and helps in understanding the current status of the pandemic in India so that effective measures can be formulated by policymakers. These insights include a global summary of the disease, the growth rate of the pandemic, and the performance of the SIR model on the given global data. The analysis is presented in tables and graphs to convey the outputs of the problem from a more detailed point of view.
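For reference, the SIR model evaluated in the paper is the classical compartmental system in its standard textbook form, with transmission rate β, recovery rate γ, and population N:

```latex
\begin{aligned}
\frac{dS}{dt} &= -\frac{\beta S I}{N}, &
\frac{dI}{dt} &= \frac{\beta S I}{N} - \gamma I, &
\frac{dR}{dt} &= \gamma I,
\end{aligned}
\qquad R_0 = \frac{\beta}{\gamma}
```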
ARTICLE | doi:10.20944/preprints202005.0207.v2
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: Software Development; Citizen Programming; JSON Schema; Data Engineering
Online: 28 May 2020 (03:04:35 CEST)
A novel software engineering platform called the Dynamic Nuchwezi Architecture Platform (DNAP) is introduced and specified, and its novelties are explained. The unique features of this platform are described, and several new concepts and abstractions upon which its implementation, usage, and analysis hinge are also discussed in detail. The motivations for this new approach to building tools, especially those used in data engineering, are spelled out, and the platform is contrasted against other existing technologies of a similar kind. Finally, the known limitations of DNAP are shown, along with the room that remains for further research and improvement in this field.
Subject: Life Sciences, Biochemistry Keywords: molecular graphics; protein visualization; software tools; virtual reality
Online: 12 January 2020 (16:26:54 CET)
Molecular visualisation is fundamental in the current scientific literature, textbooks and dissemination materials, forming an essential support for presenting results, reasoning about them and formulating hypotheses related to molecular structure. Visual exploration has become easily accessible on a broad variety of platforms thanks to advanced software tools that render a great service to the scientific community. These tools are often developed across disciplines, bridging computer science, biology and chemistry. Here we first describe a few Swiss Army knives geared towards protein visualisation for everyday use with an existing large user base, then focus on more specialised tools for peculiar needs that are not yet as broadly known. Our selection is by no means exhaustive, but reflects a diverse snapshot of scenarios that we consider informative for the reader. We end with an account of future trends and perspectives.
ARTICLE | doi:10.20944/preprints201810.0141.v1
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: queuing problem; TRIZ; Arena software; average waiting time
Online: 8 October 2018 (11:28:17 CEST)
A university canteen is a queueing system characterised by non-stationary arrivals with limited resources, where the arrival rate is time dependent and has a different pattern for each time interval. This means that at certain times of the day the arrival rate is much higher than at others; for a university canteen, the arrival rate of customers during lunchtime is much higher while the food (the resource) is limited. Non-stationary, time-dependent queueing systems are not easily modelled mathematically, hence such systems are modelled using simulation tools such as ARENA. To model a non-stationary, time-dependent queueing system with limited resources and solve queueing problems using ARENA, researchers have to depend on their knowledge and experience to identify the appropriate and relevant parameters of the system and modify them by trial and error. Hence, this research work explores the potential of applying a systematic problem-solving tool, TRIZ, to help users make better decisions in deriving solutions that improve a non-stationary, time-dependent queueing system with limited resources. A case study was carried out to minimize the waiting time of customers at the cafeteria of the Faculty of Engineering, Universiti Putra Malaysia (UPM), which had queueing problems during lunchtime for years. TRIZ was applied in this case study, and the results showed that TRIZ can assist researchers in deriving a solution model that leads to shorter waiting times without incurring additional cost or resources.
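Outside of ARENA, the same kind of non-stationary arrival process can be generated directly by thinning (Lewis-Shedler). A sketch with an invented lunchtime rate profile, not calibrated to the UPM cafeteria data:

```python
# Non-homogeneous Poisson arrivals by thinning (Lewis-Shedler).
# The lunchtime rate profile below is invented for illustration and is
# not calibrated to the UPM cafeteria data.
import random

def rate(t_min: float) -> float:
    """Arrivals per minute over a 9:00-15:00 day (t in minutes from 9:00)."""
    return 6.0 if 180 <= t_min <= 300 else 1.0   # peak 12:00-14:00

def nhpp_arrivals(horizon_min: float, rate_fn, rate_max: float) -> list[float]:
    t, arrivals = 0.0, []
    while True:
        t += random.expovariate(rate_max)            # candidate at max rate
        if t > horizon_min:
            return arrivals
        if random.random() < rate_fn(t) / rate_max:  # thin to the true rate
            arrivals.append(t)

times = nhpp_arrivals(360, rate, rate_max=6.0)
print(len(times), "arrivals; first few:", [round(x, 1) for x in times[:5]])
```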
REVIEW | doi:10.20944/preprints201810.0059.v1
Subject: Mathematics & Computer Science, Other Keywords: code smells; code fault-proneness; bugs; software evolution
Online: 3 October 2018 (14:56:48 CEST)
Context: Code smells are associated with poor design and programming style, which often degrade code quality and hamper code comprehensibility and maintainability. Goal: Identify reports from the literature that provide evidence of the influence of code smells on the occurrence of software bugs. Method: We conducted a Systematic Literature Review (SLR) to reach the stated goal. Results: The SLR includes selected studies from July 2007 to September 2017 which analyzed the source code of open source and proprietary projects, as well as several code smells and anti-patterns. The results of this SLR show that 24 code smells are more influential in the occurrence of bugs according to 16 studies, while three studies reported that at least 6 code smells are less influential in such occurrences. Evidence from the selected studies also points to the tools, techniques, and procedures applied to analyze this influence. Conclusion: To the best of our knowledge, this is the first SLR to target this goal. This study provides an up-to-date and structured understanding of the influence of code smells on the occurrence of software bugs, based on findings systematically collected from relevant references of the latest decade.
TECHNICAL NOTE | doi:10.20944/preprints201608.0180.v1
Subject: Engineering, Electrical & Electronic Engineering Keywords: UAV; Drone; monitoring; Multisensor; platform; software framework; beacons
Online: 19 August 2016 (10:42:58 CEST)
This paper presents a platform for airborne sensor applications using low-cost, open-source components carried by an easy-to-fly unmanned aerial vehicle (UAV). The system, available as open source, is designed for researchers, students, and makers for a broad range of exploration and data-collection needs. The main contribution is the extensible architecture for modularized airborne sensor deployment and real-time data visualisation. Our open-source Android application provides data collection, flight path definition, and map tools. The total cost of the system is below 800 dollars. The flexibility of the system is illustrated by mapping the locations of Bluetooth beacons (iBeacons) on a ground field and by measuring water temperatures in a lake.
REVIEW | doi:10.20944/preprints202207.0190.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: cloud computing; data storage; users; service provider; software; hardware
Online: 13 July 2022 (04:52:59 CEST)
The popularity of cloud computing is growing owing to its large data storage capacity and high computation power. It provides online, on-demand, scalable application solutions; removes hardware and software barriers for non-specialists; rapidly integrates and deploys desired and necessary facilities; and supports quick upgrading and addition of features. Users benefit from selecting the appropriate cloud computing platform for their projects. Our paper provides a comprehensive overview of the services the most common cloud computing service providers offer to users, and it can be used as a reference when selecting the best service provider based on the requirements of a project.
ARTICLE | doi:10.20944/preprints202107.0622.v1
Subject: Keywords: Monte Carlo Tree Search, Software Design, Markov Decision Process
Online: 28 July 2021 (10:29:08 CEST)
Flexible implementations of Monte Carlo Tree Search (MCTS), combined with domain-specific knowledge and hybridization with other search algorithms, can be very powerful for solving complex planning problems. We introduce mctreesearch4j, a standard MCTS implementation written as a standard JVM library following key design principles of object-oriented programming. We define key class abstractions allowing the MCTS library to flexibly adapt to any well-defined Markov Decision Process or turn-based adversarial game. Furthermore, our library is designed to be modular and extensible, utilizing class inheritance and generic typing to standardize custom algorithm definitions. We demonstrate that the design of the MCTS implementation provides ease of adaptation for unique heuristics and customization across varying Markov Decision Process (MDP) domains. The implementation is also reasonably performant and accurate for standard MDPs. Finally, through the implementation of mctreesearch4j, the nuances of different types of MCTS algorithms are discussed.
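The selection rule at the heart of most MCTS variants is UCT, shown here in its standard form; the library's exact heuristics may differ. Q(s,a) is the accumulated reward of action a at state s, N(s) and N(s,a) are visit counts, and c is the exploration constant:

```latex
a^{*} = \arg\max_{a}\left( \frac{Q(s,a)}{N(s,a)} + c\,\sqrt{\frac{\ln N(s)}{N(s,a)}} \right)
```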
ARTICLE | doi:10.20944/preprints202104.0721.v1
Online: 27 April 2021 (12:52:15 CEST)
The success of a software product depends on several factors. Given that different organizations and institutions use software products, the need for quality software suited to the goals and needs of the organization makes measuring the quality of software products an important issue for most organizations and institutions. To be sure of having the right software, it is necessary to use a standard quality model to examine its features and sub-features in a detailed and principled quality study. In this study, the quality of the Word software was measured. Considering the importance of software quality, experts skilled in this field were consulted during the study, and the impact of each quality factor and characteristic was weighted at different levels according to their opinion, to make the measurement of the quality of Word more accurate and closer to reality. In this research, the quality of the software product is measured with a fuzzy inference system based on the ISO standard. From the results obtained in this study, it is understood that quality is a continuous and hierarchical concept, and that the quality of each part of the software at every stage of production contributes to a high-quality product.
ARTICLE | doi:10.20944/preprints202004.0306.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: requirements planning; search-based software engineering; verbal decision analysis
Online: 17 April 2020 (17:10:16 CEST)
In the software development process, the decision-maker (DM) faces a range of problems inherent to the role. Wrong choices during software planning can bring great risk to the project; therefore, the planning of software releases to be delivered to the customer should be done well. This is not an easy task, because releases are made up of many requirements with complex variables that must be considered, such as precedence, cost, and requirement stability, among other features that make the requirements-selection process challenging. To make this process less exhausting, the DM can use tools that facilitate the work. In software engineering, we find fields of research specialized in this context, such as Search-Based Software Engineering (SBSE). SBSE makes use of advanced metaheuristics to search for optimal solutions or those closest to optimal. In this work, we apply another field of research to the same problem type: Verbal Decision Analysis (VDA). To do this, we elaborate a workflow that uses the same source data, executes two solutions from the two fields (SBSE and VDA), and compares the results. In the end, we evaluate and comment on the results.
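In its simplest form, the requirements-selection problem behind both approaches is a 0/1 knapsack: choose the subset of requirements maximizing value within a release budget. The sketch below is a simplification that ignores precedence and requirement stability, which the paper also models; the data is invented:

```python
# Simplified next-release problem as a 0/1 knapsack (ignores precedence
# and requirement stability, which the paper also models). Data invented.
def select_requirements(values, costs, budget):
    n = len(values)
    best = [[0] * (budget + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for b in range(budget + 1):
            best[i][b] = best[i - 1][b]
            if costs[i - 1] <= b:
                take = best[i - 1][b - costs[i - 1]] + values[i - 1]
                best[i][b] = max(best[i][b], take)
    chosen, b = [], budget
    for i in range(n, 0, -1):            # trace back the chosen items
        if best[i][b] != best[i - 1][b]:
            chosen.append(i - 1)
            b -= costs[i - 1]
    return best[n][budget], sorted(chosen)

# Four requirements, budget of 8 cost units -> value 17 with items 0 and 1.
print(select_requirements(values=[10, 7, 4, 9], costs=[5, 3, 2, 6], budget=8))
```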
REVIEW | doi:10.20944/preprints201912.0145.v1
Subject: Engineering, Other Keywords: Requirement Change Management; Methodology; Change Management Process; Software System
Online: 10 December 2019 (16:41:40 CET)
Requirements gathering is an important phase of software development: requirements are its basis, and the success or failure of any software depends on the level of understanding developed of its requirements. During software development, requirements keep changing for different reasons; hence requirements are a critical phase whose mishandling can lead to total project failure. So, to understand the impacts and to identify conflicts with existing requirements, it is important to manage and analyze the requirements well. Requirement change management is the interest of this paper. Different requirement change management techniques are discussed and analyzed, and conclusions are drawn accordingly.
ARTICLE | doi:10.20944/preprints201811.0552.v1
Subject: Mathematics & Computer Science, Probability And Statistics Keywords: Additive Outliers, Models, Simulation, Time Series length, R Software
Online: 22 November 2018 (14:56:57 CET)
It is common practice to detect outliers in a financial time series in order to avoid the adverse effects of additive outliers. This paper investigated the performance of GARCH family models (sGARCH, gjrGARCH, iGARCH, TGARCH, and NGARCH) in the presence of different sizes of outliers (small, medium, and large) for different time series lengths (250, 500, 750, 1000, 1250, and 1500), using root mean square error (RMSE) and mean absolute error (MAE) to judge the models. In a simulation of 1000 iterations in the R environment using the rugarch package, the results revealed that for small outliers, irrespective of the length of the time series, iGARCH dominated; for medium outliers, sGARCH and gjrGARCH dominated irrespective of time series length; and for large outliers, irrespective of time series length, gjrGARCH dominated. The study further revealed that, in the presence of additive outliers, both RMSE and MAE increased as the time series length increased.
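For reference, the baseline sGARCH(1,1) variance recursion that the other variants extend is, in its standard form:

```latex
\sigma_t^{2} = \omega + \alpha\,\varepsilon_{t-1}^{2} + \beta\,\sigma_{t-1}^{2},
\qquad \varepsilon_t = \sigma_t z_t,\; z_t \sim \mathcal{N}(0,1)
```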
ARTICLE | doi:10.20944/preprints201709.0155.v1
Subject: Earth Sciences, Geoinformatics Keywords: object-oriented technique; change detection; eCognition® software; landuse
Online: 29 September 2017 (12:51:40 CEST)
This study compared two object-oriented land use change detection methods—detection after classification (DAC) and classification after detection (CAD) —based on a digital elevation model, slope data, and multi-temporal Landsat images (TM image for 2000 and ETM image for 2010). We noted that the overall accuracy of the DAC (86.42%) was much higher than that of the CAD (71.71%). However, a slight difference between the accuracies of the two methods exists for deciduous broadleaf forest, evergreen coniferous forest, mixed wood, upland, paddy, reserved land, and settlement. Owing to substantial spectrum differences, these land use types can be extracted using spectral indexes. The accuracy of DAC was much higher than that of CAD for industrial land, traffic land, green shrub, reservoir, lake, river, and channel, all of which share similar spectrums. The discrepancy was mainly because DAC can completely utilize various forms of information apart from spectrum information during a two-stage classification. In addition, the change-area boundary was not limited at first, but was adjustable in the process of classification. DAC can overcome smoothing effects to a great extent using multi-scale segmentations and multi-characters in detection. Although DAC yielded better results, it was more time-consuming (28 days) because it uses a two-stage classification approach. Conversely, CAD consumed less time (15 days). Thus, a hybrid of the two methods is recommended for application in land use change detection.
ARTICLE | doi:10.20944/preprints202104.0425.v1
Subject: Engineering, Automotive Engineering Keywords: Tolls; INTEGRATION software; microscopic traffic simulation; traveler value of time
Online: 15 April 2021 (16:52:34 CEST)
Unique analytical challenges arise when drivers, who face a route choice between a toll lane and a set of free lanes, have different values of time. The most complex situation is one in which multiple sub-populations of drivers exist, each with its own unique mean and coefficient of variation of value of time. This situation, when embedded within a larger network, cannot be tackled using existing planning models and consequently is usually only approximated. This paper examines these different approximations, the resulting numerical solutions, and the implications of these approximations for the estimate of the number of expected toll lane users. The paper also shows how this problem can be solved using a combined traffic assignment/simulation model. The first part of this paper develops an analytical formulation for solving the toll lane scenario using "value of time" representations ranging from the simplest to the most complex. It is shown that one of the most critical issues is determining who the marginal users of the toll lane are at each level of usage, as the perceived disutility of the last marginal toll lane user depends dynamically upon that driver's value of time. Analytical formulations based on these different approximations are then solved numerically in the second part of the paper. These numerical solutions show that significantly different lane-use estimates result, depending upon the representation of value of time. Consequently, it is clear that solving this problem with the fewest approximations is of both theoretical and practical importance. The third part of the paper illustrates the solution to the toll lane problem, with each level of approximation, using a combined traffic assignment/simulation model. The resulting simulated estimates of toll lane usage for each case match both the relative and absolute trends found in the analytical solutions. However, the solution using the assignment/simulation model is not only much faster and simpler to obtain, but is also scalable in both size and complexity. The additional complexities associated with a less approximate representation of value of time should therefore be incorporated in all future assessments of toll lane facilities, whether they are analyzed analytically or through simulation.
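The marginal-user determination reduces to an indifference condition: a driver with value of time v chooses the toll lane when the toll τ, converted to time, is less than the travel-time saving. In the simplest single-population case (a generic statement of the condition, not the paper's full formulation):

```latex
\frac{\tau}{v} < t_{\text{free}} - t_{\text{toll}}
\;\Longleftrightarrow\;
v > v^{*} = \frac{\tau}{t_{\text{free}} - t_{\text{toll}}}
```

Drivers with v above the threshold v* use the toll lane; because t_free and t_toll themselves depend on the resulting lane split, v* must be found as a fixed point, which is what makes the multi-population case hard.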
ARTICLE | doi:10.20944/preprints202104.0028.v1
Subject: Mathematics & Computer Science, Algebra & Number Theory Keywords: Software Engineering; Model; Model-Driven; Model Driven Development; MDD; MDA
Online: 1 April 2021 (14:47:55 CEST)
In Model-Driven Development (MDD), models, their generation, and the imposition of changes on them (model transformation) are used for the development of software. Models provide a framework to start from imagination and abstraction and to create and accomplish the final system. Models create a slow and steady transition from whatness to howness, i.e., they follow the natural path of the generation of software. To support this path, the logic and functionality of software must be changeable during its evolution. Here we provide a brief introduction to the concept of Model-Driven Development.
ARTICLE | doi:10.20944/preprints202004.0079.v1
Subject: Chemistry, Medicinal Chemistry Keywords: COVID-19; Nigella Sativa; 6LU7; 2GTB; molecular docking; MOE software
Online: 7 April 2020 (08:58:42 CEST)
The spread of the global COVID-19 pandemic, the lack of a specific treatment, and the urgency of the situation require the use of all resources to remedy this scourge. In the present study, using molecular docking, we identify probable new inhibitors of COVID-19 among molecules from Nigella sativa L., a highly reputed healing herb in North African societies and in both Islamic and Christian traditions. The discovery of the Mpro protease structure in COVID-19 provides a great opportunity to identify potential drug candidates for treatment. Focusing on the main proteases in CoVs (3CLpro/Mpro) (PDB IDs 6LU7 and 2GTB), docking of compounds from Nigella sativa and drugs under clinical testing was performed using the Molecular Operating Environment (MOE) software. Nigelledine docked into the 6LU7 active site gives a complex energy of about -6.29734373 kcal/mol, which is close to the energy score given by chloroquine (-6.2930522 kcal/mol) and better than the energy scores given by hydroxychloroquine (-5.57386112 kcal/mol) and favipiravir (-4.23310471 kcal/mol). Docking into the 2GTB active site showed that α-hederin gives an energy score of about -6.50204802 kcal/mol, which is better than the energy scores given by chloroquine (-6.20844936 kcal/mol), hydroxychloroquine (-5.51465893 kcal/mol), and favipiravir (-4.12183571 kcal/mol). Nigellidine and α-hederin appeared to have the best potential to act as COVID-19 treatments. Further research is necessary to verify the medicinal use of the identified compounds and to encourage preventive use of Nigella sativa against coronavirus infection.
Subject: Engineering, Other Keywords: SACDM; SOS; SQA; key factors software quality assurance; Scrum; stakeholder
Online: 9 December 2019 (07:37:30 CET)
The main aim of this study is to examine the behavior of Software Quality Assurance (SQA) issues among project stakeholders in a Scrum environment and their consequences. This inductive case study identifies SQA principles relevant to meeting user expectations. Stakeholders in Scrum projects lack concrete guidance on Scrum's SQA approaches, methods, and techniques. The insufficiency of concrete guidelines in Scrum requires a management squad to develop concepts that can include implementing practices from other methodologies and wisely modifying the system structure to incorporate the adopted practices, ensuring improvement in the processes and creating a shared-ownership environment. By illustrating the incompleteness of Agile approaches, with special attention to the lack of concrete instructions in the Scrum methodology, the study uses strategies to adapt the literature and argues for Scrum's versatility.
Subject: Engineering, Automotive Engineering Keywords: software defect prediction; machine learning approach; integrated approach; Deep Forest
Online: 6 December 2019 (04:25:21 CET)
Accurate prediction of defects in software components plays a vital role in managing the quality and efficiency of the system to be developed. We have therefore written a systematic literature review to evaluate the four main defect prediction techniques. Defect prediction paves the way for testers to find bugs and fix them in order to achieve input-to-output conformance. In this paper we discuss the open issues in predicting software defects and provide a detailed analysis of different methods, including machine learning, integrated approaches, cross-project prediction, and the Deep Forest algorithm, to prevent these flaws. However, it is almost impossible to rule that one method is better than another, so every technique can be analyzed separately, and the technique best suited to the problem at hand can be used or altered to create a hybrid technique suitable for the cause.
ARTICLE | doi:10.20944/preprints201905.0174.v2
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: cloud computing; big data; fog computing; software-defined networking; network management; resource management; topology.
Online: 26 February 2020 (15:34:25 CET)
Cloud infrastructure provides computing services where computing resources can be adjusted on demand. However, the adoption of cloud infrastructures brings concerns such as reliance on the service provider network, reliability, and compliance with service level agreements (SLAs). Software-defined networking (SDN) is a networking concept that suggests the segregation of a network's data plane from the control plane, which improves networking behavior. In this paper, we present an SDN-enabled resource-aware topology framework. The proposed framework employs SLA compliance, a Path Computation Element (PCE), and fair load sharing to achieve better topology features. We also present an evaluation showcasing the potential of our framework.
ARTICLE | doi:10.20944/preprints201905.0111.v1
Subject: Materials Science, Metallurgy Keywords: pulse volume; signal noise ratio; automated ultrasonic testing; simulation software
Online: 9 May 2019 (12:47:50 CEST)
Titanium's accelerating usage in global markets is attributable to its distinctive combination of physical and metallurgical properties. The key to best utilizing titanium is to exploit these characteristics, especially as they complement one another in a given application, rather than to just directly substitute titanium for another metal. Titanium alloys are extensively used in aerospace applications such as components in aero-engines and space shuttles, mainly due to their superior strength-to-weight ratio. For these demanding applications, the functionality and reliability of components are of great importance. To increase flight safety, higher-sensitivity inspections are sought for rotating parts. Increased sensitivity can be applied at the billet stage, the forging stage, or both. Inspection of the forging geometry affords the opportunity to apply the highest sensitivity due to the shorter material paths when compared to those required for billet inspections. Forging inspection is typically performed for titanium (Ti) rotating parts with immersion inspection and fixed-focus, single-element transducers. Increased gain is required with depth because the ultrasonic beam attenuates with distance and diverges beyond the focus position that is placed near the surface. The higher gain that is applied with depth has the effect of increasing the UT noise with depth. The relationships between the UT noise, the selection of the examination technique, and the smallest detectable defect are presented in this paper.
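To illustrate why gain (and hence noise) must grow with depth, here is a minimal sketch of a distance-amplitude style gain correction that combines two-way material attenuation with far-field beam divergence; the attenuation coefficient and focal depth are assumed values, not parameters from the paper.

```python
import math

def required_gain_db(depth_mm, alpha_db_per_mm=0.05, focus_mm=25.0):
    """Extra gain needed to equalize echo amplitude at a given depth."""
    attenuation = 2 * alpha_db_per_mm * depth_mm        # two-way travel
    # Beyond the focus, pressure falls roughly as 1/z (beam divergence).
    divergence = (20 * math.log10(depth_mm / focus_mm)
                  if depth_mm > focus_mm else 0.0)
    return attenuation + divergence

for z in (10, 25, 50, 100):
    print(f"depth {z:3d} mm -> extra gain {required_gain_db(z):5.1f} dB")
```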
ARTICLE | doi:10.20944/preprints201904.0106.v1
Subject: Engineering, Other Keywords: cloud computing; security patterns; privacy patterns; software and system architecture
Online: 9 April 2019 (11:46:02 CEST)
Requirements for cloud services include security and privacy. Although many security patterns, privacy patterns, and non-pattern-based knowledge have been reported, knowing which pattern or combination of patterns to use in a specific scenario is challenging due to the sheer volume of options and the layered cloud stack. To deal with security and privacy in cloud services, this study proposes the Cloud Security and Privacy Metamodel (CSPM). CSPM uses a consistent approach to classify and support existing security and privacy patterns. In addition, CSPM is used to develop a security- and privacy-aware process for developing cloud systems. The effectiveness and practicality of CSPM are demonstrated via several case studies.
ARTICLE | doi:10.20944/preprints201904.0008.v1
Subject: Earth Sciences, Geoinformatics Keywords: GRASS GIS; g.citation; software citation; open science; OSGeo; credit; rewards
Online: 1 April 2019 (10:19:53 CEST)
The authors introduce the GRASS GIS add-on module g.citation as an initial implementation of a fine-grained software citation concept. The module extends the existing citation capabilities of GRASS GIS, which until now only provided for automated citation of the software project as a whole, authored by the GRASS Development Team, without reference to individual persons. The functionality of the new module enables individual code citation for each of the over 500 implemented functionalities, including add-on modules. Three different classes of citation output are provided in a variety of human- and machine-readable formats. The implications of this reference implementation of scientific software citation, both for the GRASS GIS project and for the OSGeo Foundation, are outlined.
ARTICLE | doi:10.20944/preprints201809.0382.v1
Subject: Biology, Forestry Keywords: forest road surface; forest road damage; vibration measurements; vibration software
Online: 19 September 2018 (10:43:25 CEST)
In terms of the number of vehicles, forest roads are characterized by low traffic intensity; on the other hand, great ground pressures occur between the wheels of timber truck units and the forest road surface, often with loads above 80 kN, which additionally damage the upper and lower forest road layers. There are currently several methods for assessing the condition of a forest road surface; they are mainly used for assessing the state of public roads, but can be used in forestry as well. Assessing the condition of the forest road surface was done by measuring vibrations with specially developed software for Android OS installed on a Huawei MediaPad 7 Lite. The software measured vibrations in all three axes, the coordinates of the device, the speed of the vehicle, and time. The aim of this research was to determine the accuracy of the collected data so that this method can be used for scientific and practical purposes. The research was carried out on a segment of a forest road while driving a vehicle equipped with the measuring device. Tests were performed in both driving directions of the forest road segment with different measuring frequencies, tyre inflation pressures, and driving speeds. Vibration values were classified and plotted on a map of the forest road together with the device's measured coordinates, and were compared with the locations of recorded forest road surface damage. The results show no significant difference in vibration values between measurement frequencies of 1 Hz and 10 Hz. Based on the analysis of the collected data and the obtained results, it is clear that it is possible to assess the condition of a forest road surface by measuring vibrations. The greatest vibration values were recorded on the most damaged parts of the forest road. Vibrations do not depend on tyre inflation pressure, but the ranges of vibrations decrease with decreasing driving speed. The accuracy of the collected data depends on GPS signal quality, so it is recommended that each segment of forest road be recorded twice so that the locations of damage on the forest road can be confirmed with certainty.
ARTICLE | doi:10.20944/preprints202109.0138.v1
Subject: Keywords: Agaricus Bisporus; Button Mushroom; Molecular docking; PyRx software; BIOVIA Discovery studio
Online: 8 September 2021 (10:14:03 CEST)
Agaricus bisporus belongs to the family Agaricaceae and is the most widely accepted and cultivated of all mushrooms. It has great nutritional value and is rich in proteins, vitamins, carbohydrates, fibers, minerals, and amino acids. It exhibits antimicrobial, anticancer, antidiabetic, antihypercholesterolemic, antihypertensive, hepatoprotective, and antioxidant activities. Given its anticancer properties, we examined the effects of the chemical constituents of Agaricus bisporus on a DNA damage response protein to assess their PARP-inhibiting activity. We chose the molecular docking technique to check the effects of the different chemical constituents of Agaricus bisporus on this DNA damage response protein, with different PARP inhibitor drugs taken as standards. We performed molecular docking of the chemical constituents of Agaricus bisporus against the 4UND protein with the help of the PyRx and BIOVIA Discovery Studio software; the PARP inhibitor drugs were also docked against the same protein. The results of the molecular docking show that some of the constituents of Agaricus bisporus have better binding affinities than the standard PARP inhibitor drugs. Ergosterol showed better binding affinity than niraparib and rucaparib on the same protein, while naringenin, quercetin, anthocyanin, folate, and myricetin showed better results than rucaparib. Ergosterol thus shows the greatest potential as a PARP inhibitor, outperforming both niraparib and rucaparib.
ARTICLE | doi:10.20944/preprints202105.0479.v1
Subject: Engineering, Automotive Engineering Keywords: software quality; Adaptive Neural Fuzzy; ISO standard; quality model; Inference system
Online: 20 May 2021 (10:31:56 CEST)
Computer systems are involved in many critical human applications today, where a small error can lead to serious and dangerous problems. These errors can range from an error in the design of the user interface to an error in the program code. The success of a software product depends on several factors. Given that different organizations and institutions use software products, the need for quality software that meets the goals and needs of the organization makes measuring the quality of software products an important issue for most organizations and institutions. To be sure of having the right software, it is necessary to use a standard quality model to examine the features and sub-features in a detailed and principled treatment of quality. In this study, the quality of the Word software was measured with an Adaptive Neuro-Fuzzy Inference System (ANFIS). In recent years, powerful fuzzy inference systems based on adaptive neural networks have been used in various sciences. Using the training power of neural networks and the linguistic advantages of fuzzy systems, these systems combine the strengths of both when analyzing very complex processes. Given the importance of software quality, the system was applied at different levels to make the measurement of the quality of the Word software more accurate and closer to reality. In this research, the quality of the software product is measured based on an adaptive neuro-fuzzy inference system within the ISO standard. According to the results obtained in this study, quality is a continuous and hierarchical concept, and the quality of each part of the software at any stage of production can lead to high-quality products.
ARTICLE | doi:10.20944/preprints202104.0405.v1
Subject: Engineering, Automotive Engineering Keywords: extrusion; 52In-48Sn alloy; wire; lead-free solder; rod; simulation; software.
Online: 15 April 2021 (10:27:48 CEST)
In this article, a technology for producing wire and rod solder from 52In-48Sn alloy has been developed and investigated under small-scale production conditions. The use of direct extrusion of wire and rods, instead of the traditional solder production technology that includes pressing, rolling, and drawing, can significantly reduce the fleet of required equipment. Using only a melting furnace and a hydraulic press, solder wires and rods can be produced in various sizes. Shortening the production cycle makes it possible to quickly fulfill small orders and remain competitive in sales. The article develops a mathematical model of direct extrusion that allows the extrusion ratio, extrusion speed, and pressing force to be calculated. The results of modeling the extrusion of ∅2.00 mm wire and ∅8.0 mm rods made of 52In-48Sn alloy are presented. The temperature of the solder and the tool was simulated in the QForm software based on the finite element method. Experimental results of manufacturing ∅2.0 mm solder wire and ∅8.0 mm rods are presented. The microstructure of the direct-extruded solder is a eutectic of the γ and β phases. Energy-dispersive X-ray spectroscopy (EDS) mapping of the 52In-48Sn alloy showed that the solder obtained by direct extrusion has a uniform distribution of structural phases. The developed technology can be used in the manufacture of wires and rods from other low-melting alloys.
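As a pointer to what such a model computes, the following minimal worked example evaluates the three quantities named above (extrusion ratio, exit speed, and pressing force) using standard textbook relations; the billet diameter, ram speed, and extrusion constant are assumed for illustration and are not the authors' values.

```python
import math

billet_d  = 30.0   # mm, billet/container diameter (assumed)
wire_d    = 2.0    # mm, product diameter
ram_speed = 2.0    # mm/s (assumed)
k         = 40.0   # MPa, extrusion constant for the alloy (assumed)

area_billet = math.pi * billet_d ** 2 / 4
area_wire   = math.pi * wire_d ** 2 / 4

ratio      = area_billet / area_wire                    # extrusion ratio R
exit_speed = ram_speed * ratio                          # volume conservation
force_kN   = area_billet * k * math.log(ratio) / 1000   # F = A0 * k * ln(R)

print(f"extrusion ratio R = {ratio:.0f}")
print(f"wire exit speed   = {exit_speed:.0f} mm/s")
print(f"pressing force    = {force_kN:.0f} kN")
```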
ARTICLE | doi:10.20944/preprints202102.0535.v1
Subject: Engineering, Automotive Engineering Keywords: Connected vehicles; C-V2X; V2V; INTEGRATION software; traffic simulation; communication modeling
Online: 23 February 2021 (19:38:56 CET)
The transportation system has evolved into a complex cyber-physical system with the introduction of wireless communication and the emergence of connected travelers and connected automated vehicles. Such applications create an urgent need to develop high-fidelity transportation modeling tools that capture the mutual interaction of the communication and transportation systems. This paper addresses this need by developing a high-fidelity, large-scale, dynamic and integrated traffic and direct cellular vehicle-to-vehicle and vehicle-to-infrastructure (collectively known as V2X) modeling tool. The unique contributions of this work are (1) we developed a scalable analytical communication model that captures packet movement at the millisecond level; (2) we coupled the communication and traffic simulation models in real time to develop a fully integrated dynamic connected vehicle modeling tool; and (3) we developed scalable approaches that adjust the frequency of model coupling depending on the number of concurrent vehicles in the network. The proposed scalable modeling framework is demonstrated by running on the downtown Los Angeles network considering the morning peak-hour traffic demand (145,000 vehicles), running faster than real time on a regular personal computer (1.5 hours to run 1.86 hours of simulation time). Spatiotemporal estimates of packet delivery ratios for downtown Los Angeles are presented. This novel modeling framework provides a breakthrough in the development of urgently needed tools for large-scale testing of direct C-V2X enabled applications.
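Contribution (3) can be pictured with a toy coupling loop in which the synchronization interval widens as the number of concurrent vehicles grows. The classes, thresholds, and time step below are hypothetical stand-ins, not the actual traffic/communication-model interface.

```python
import random

class TrafficModel:
    """Stand-in for the traffic simulator."""
    def step(self):
        return random.randint(5_000, 150_000)    # concurrent vehicle count

class CommModel:
    """Stand-in for the V2X communication model."""
    def exchange_state(self, traffic):
        pass   # here: trade vehicle positions for packet-delivery results

def coupling_interval_ms(num_vehicles):
    """More concurrent vehicles -> coarser (less frequent) coupling."""
    if num_vehicles < 10_000:
        return 100
    if num_vehicles < 50_000:
        return 500
    return 1_000

traffic, comm = TrafficModel(), CommModel()
t = 0
while t < 10_000:                                # 10 simulated seconds
    n = traffic.step()                           # advance traffic by 100 ms
    if t % coupling_interval_ms(n) == 0:
        comm.exchange_state(traffic)             # synchronize the two models
    t += 100
```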
Subject: Mathematics & Computer Science, Algebra & Number Theory Keywords: containers; virtual machines; cloud; COVID-19; serverless; analytics; software defined infrastructure
Online: 19 February 2021 (11:31:42 CET)
The XPRIZE Foundation designs and operates multi-million-dollar global competitions to incentivize the development of technological breakthroughs that accelerate humanity toward a better future. To combat the COVID-19 pandemic, the Foundation coordinated with several organizations to make available data sets about different facets of the disease and to provide the computational resources needed to analyze those data sets. This paper is a case study of the requirements, design, and implementation of the XPRIZE Data Collaborative, a cloud-based infrastructure that enables the XPRIZE to meet its COVID-19 mission and host future data-centric competitions. We examine how a cloud-native application can use an unexpected variety of cloud technologies, ranging from containers and serverless computing to older ones like virtual machines. We also document the effects that the pandemic had on application development in the cloud. We include our experiences of having users successfully exercise the Data Collaborative, detailing the challenges encountered and areas for improvement and future work.
ARTICLE | doi:10.20944/preprints202101.0113.v1
Subject: Mathematics & Computer Science, Algebra & Number Theory Keywords: Complexity Analysis & Mitigation; Software Architecture & Design; Safety; Quality; Fragility; Failure Obviation
Online: 6 January 2021 (11:45:48 CET)
Studies have found critical software malfunctions responsible for some of the worst accidents in recent times. These malfunctions are often only minor defects that snowball into large problems; a few lines of code is all it takes. Complexity, safety, quality, and resilience are among the key attributes defining a software’s operational success. There are many leading factors for complexity, such as increases in the product size, the rate of requirement changes, and the number and type of stakeholders, and failure to manage these issues efficiently always has the same consequence, i.e., massive failure and sometimes technological catastrophe. This work analyzes some of the architecture, design, and implementation guidelines used as detection and mitigation techniques. It also discusses the safety considerations, as considering how the steam industry has handled safety issues could offer some guidance for ensuring safety. Complexity in such systems also causes some of the worst side effects from the quality auditor's perspective. While failures in the software are hard to predict, one of the most significant ways of showing preparedness is practicing software resilience. New mitigation areas, such as the fragility spectrum and failure obviation, and their usage for building a safer system are analyzed. Also discussed are various architecture styles in practice and the dramatic effect human factors can have on the success of the software being developed.
Subject: Engineering, Automotive Engineering Keywords: software project management; complexity factors; PMBOK; paradigms of complexity; knowledge areas
Online: 3 December 2019 (12:00:18 CET)
Software project complexity increases day by day because software engineering products are being used to solve more technically difficult problems and project sizes continue to grow. The increased complexity leads to a high number of software project failures in terms of time, cost, and quality. The main question regarding this problem is how to handle or cope with this complexity. There is no single way to handle it; software engineers use different perspectives to handle complexity without affecting overall project performance. A management perspective recognizes that the success of a complex project requires good project management. A technical perspective reveals new paradigms for software development, e.g., object-oriented and formal methods, and software engineers also look to an automation perspective in order to reduce complexity issues. In this paper we identify the main software project complexity factors by focusing on the management aspects of software project development, as well as the problems of managing complexity in software engineering products from these different perspectives. The paper is divided into three main sections: paradigms of software development; project management in terms of time, cost, and quality; and automated support, which covers the methods and tools used to manage complexity.
ARTICLE | doi:10.20944/preprints201806.0226.v1
Subject: Mathematics & Computer Science, Artificial Intelligence & Robotics Keywords: autogenous intelligence; bootstrap fallacy; recursive self-improvement; self-modifying software; singularity
Online: 14 June 2018 (08:53:23 CEST)
Toby Walsh, in "The Singularity May Never Be Near," gives six arguments to support his view that a technological singularity may happen but is unlikely. In this paper, we provide an analysis of each of his arguments and arrive at similar conclusions, but with more weight given to the "likely to happen" probability.
ARTICLE | doi:10.20944/preprints201710.0042.v1
Subject: Engineering, Biomedical & Chemical Engineering Keywords: free software; human motion; Kinovea; low cost; reliability; validity; video analysis
Online: 9 October 2017 (05:07:57 CEST)
Clinical rehabilitation and sports performance analysis both require the objectification of movement. Kinovea© is free 2D motion analysis software that enables the establishment of kinematic parameters. This low-cost technology has been used in sports science as well as in clinical and research work. Although it has been validated as a tool with which to assess time-related variables, this is not yet the case for angular and distance variables. The main objective of this study was to determine the validity and reliability of the Kinovea software in obtaining angular and distance data at different perspectives of 90°, 75°, 60°, and 45°. For this purpose, a figure with 29 points was designed (in AutoCAD) and 24 frames were analysed. Each frame was examined by three observers who each made two attempts. For each exported data item, 20 angle and 20 distance variables were calculated, with intra- and inter-observer reliability also analysed. To evaluate Kinovea's reliability and validity, a multiple approach was applied involving the following analyses: systematic error with a two-way 2x4 ANOVA; relative reliability with the ICC and CV (95% confidence interval); and absolute reliability with the standard error. The results thus obtained indicate that the Kinovea software is a valid and reliable tool that is able to measure accurately at distances up to 5 m from the object and at an angle range of 90°–45°. Nevertheless, for optimum results an angle of 90° is suggested.
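The angular variables validated in such a study reduce to angles between digitized 2D points. A minimal sketch follows; the coordinates are illustrative, and this is not Kinovea's code.

```python
import math

def angle_deg(a, b, c):
    """Angle ABC in degrees at vertex b, from 2D point coordinates."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

print(angle_deg((0, 0), (1, 0), (1, 1)))   # 90.0
```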
ARTICLE | doi:10.20944/preprints201708.0066.v1
Subject: Engineering, Other Keywords: non-homogeneous poisson process; software reliability; weibull function; mean square error
Online: 18 August 2017 (13:05:46 CEST)
The main focus when developing software is to improve the reliability and stability of a software system. When software systems are introduced, these systems are used in field environments that are the same as or close to those used in the development-testing environment; however, they may also be used in many different locations that may differ from the environment in which they were developed and tested. In this paper, we propose a new software reliability model that takes into account the uncertainty of operating environments. The explicit mean value function solution for the proposed model is presented. Examples are presented to illustrate the goodness-of-fit of the proposed model and several existing non-homogeneous Poisson process (NHPP) models and confidence intervals of all models based on two sets of failure data collected from software applications. The results show that the proposed model fits the data more closely than other existing NHPP models to a significant extent.
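As a concrete flavor of the evaluation described, the following sketch fits a Weibull-type NHPP mean value function m(t) = a(1 - exp(-b t^beta)) to cumulative failure counts and reports the mean squared error. The data and this particular mean value function are stand-ins for illustration; the proposed model additionally accounts for operating-environment uncertainty.

```python
import numpy as np
from scipy.optimize import curve_fit

def mvf(t, a, b, beta):
    """Weibull-type mean value function: expected cumulative failures."""
    return a * (1.0 - np.exp(-b * t ** beta))

t = np.arange(1, 13, dtype=float)                       # weeks of testing
failures = np.array([4, 9, 15, 19, 24, 27, 31, 33, 36, 37, 39, 40], float)

params, _ = curve_fit(mvf, t, failures, p0=(45, 0.1, 1.0), maxfev=10_000)
mse = np.mean((mvf(t, *params) - failures) ** 2)        # goodness of fit
print(f"a={params[0]:.1f}  b={params[1]:.3f}  beta={params[2]:.2f}  MSE={mse:.3f}")
```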
ARTICLE | doi:10.20944/preprints201704.0130.v1
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: intelligent robotics; flexibility; reusability; multisensor; state machine; software architecture; computer vision
Online: 20 April 2017 (04:14:33 CEST)
This paper presents a state-machine-based architecture that enhances the flexibility and reusability of industrial robots, more concretely dual-arm multisensor robots. The proposed architecture, in addition to allowing absolute control of the execution, eases the programming of new applications by increasing the reusability of the developed modules. Through an easy-to-use graphical user interface, operators are able to create, modify, reuse, and maintain industrial processes, increasing the flexibility of the cell. Moreover, the proposed approach is applied in a real use case in order to demonstrate its capabilities and feasibility in industrial environments. A comparative analysis is presented evaluating the presented approach against traditional robot programming techniques.
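A minimal sketch of the execution pattern such an architecture relies on: states encapsulate reusable skills, return outcome labels, and a transition table wires them into a process. The state names, outcomes, and context dictionary are illustrative, not the paper's implementation.

```python
class State:
    """A reusable skill; execute() returns an outcome label."""
    def execute(self, ctx):
        raise NotImplementedError

class DetectPart(State):
    def execute(self, ctx):
        ctx["pose"] = (0.42, 0.10, 0.05)   # e.g. pose from a vision module
        return "found"

class PickPart(State):
    def execute(self, ctx):
        print(f"picking part at {ctx['pose']}")
        return "done"

class StateMachine:
    def __init__(self, states, transitions, start):
        self.states, self.transitions, self.current = states, transitions, start

    def run(self, ctx):
        while self.current != "END":
            outcome = self.states[self.current].execute(ctx)
            self.current = self.transitions[(self.current, outcome)]

sm = StateMachine(
    states={"DETECT": DetectPart(), "PICK": PickPart()},
    transitions={("DETECT", "found"): "PICK", ("PICK", "done"): "END"},
    start="DETECT",
)
sm.run({})
```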
ARTICLE | doi:10.20944/preprints202112.0386.v1
Subject: Engineering, Civil Engineering Keywords: energy efficiency; natural building; conventional building; TRNSYS software; temperature; humidity; energy consumption
Online: 23 December 2021 (11:48:08 CET)
The construction sector accounts for over one-third of global energy consumption and contributes 40% of CO2 emissions, according to the International Energy Agency (IEA) and the United Nations' 2020 annual report on Goal 11 (make cities inclusive, safe, resilient and sustainable), which discusses sustainable, safe, and efficient buildings. Morocco has therefore committed to this program by publishing law 47-09 on energy efficiency. This work aims to study the energy efficiency of two types of building: a conventional building and a natural building. The conventional building is constructed using concrete, while the natural one uses sand clay and straw. The technique for making the natural building follows the same approach customary in rural zones of the Atlas Mountains in Morocco. In this research we also simulate the temperature and humidity variation inside these buildings using the TRNSYS software; the SketchUp software was employed to design the houses. The weather database used is for a typical meteorological year (TMY). In the case of the natural building, many configurations were simulated: roof insulation, floor insulation, different types of glazing, and sun protection. Moreover, thermal comfort proves to be more pronounced in the case of the natural building.
ARTICLE | doi:10.20944/preprints201911.0366.v1
Subject: Mathematics & Computer Science, Computational Mathematics Keywords: neural networks; topology; directed graphs; directed flag complexes; persistent homology; computational software
Online: 29 November 2019 (03:03:36 CET)
We present a new computing package, Flagser, designed to construct the directed flag complex of a finite directed graph and to compute persistent homology for flexibly defined filtrations on the graph and the resulting complex. The persistent homology computation part of Flagser is based on the program Ripser, but is optimised specifically for large computations. The construction of the directed flag complex is done in a way that allows easy parallelisation over arbitrarily many cores. Flagser also has the option of working with undirected graphs. For homology computations Flagser has an approximate option, which shortens compute time with remarkable accuracy. We demonstrate the power of Flagser by applying it to the construction of the directed flag complex of digital reconstructions of brain microcircuitry by the Blue Brain Project and several other examples, in some of which we also compute homology. For a more complete performance analysis, we also apply Flagser to other data collections. In all cases the hardware used in the computation, the memory usage, and the compute time are recorded.
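For intuition about the object Flagser builds: in the directed flag complex, a k-simplex is an ordered tuple (v0, ..., vk) with an edge vi -> vj for every i < j. The brute-force enumeration below illustrates the definition only; Flagser itself is heavily optimised and parallelised.

```python
def directed_flag_complex(vertices, edges, max_dim=2):
    """All simplices up to max_dim, returned as one list per dimension."""
    adj = {v: {w for (u, w) in edges if u == v} for v in vertices}
    simplices = [[(v,) for v in vertices]]
    for _ in range(max_dim):
        nxt = []
        for s in simplices[-1]:
            # extend by every w that all vertices of s point to
            common = set.intersection(*(adj[v] for v in s))
            nxt.extend(s + (w,) for w in sorted(common))
        simplices.append(nxt)
    return simplices

# A directed 3-cycle has no 2-simplex; orienting one edge as 0 -> 2 creates one.
print(directed_flag_complex([0, 1, 2], {(0, 1), (1, 2), (0, 2)})[2])  # [(0, 1, 2)]
```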
ARTICLE | doi:10.20944/preprints201807.0045.v1
Subject: Mathematics & Computer Science, Applied Mathematics Keywords: verbal decision analysis; multi-objective optimization; software release planning; ZAPROS III-i
Online: 3 July 2018 (12:24:02 CEST)
The activity of prioritizing software requirements should be done as efficiently as possible. Selecting the most stable requirements for the customers most important to the development company can be a positive factor, considering that the available resources do not always allow the implementation of all requirements. Many quantitative methods for software release prioritization exist in the field of Search-Based Software Engineering (SBSE). However, we show that it is possible to use qualitative Verbal Decision Analysis (VDA) methods to solve this same type of problem. Moreover, we use the ZAPROS III-i method to prioritize requirements considering the opinion of the decision-maker, who participates in this process. The results obtained with the structured VDA methods were quite satisfactory when compared to the methods using SBSE. A comparison of the results of the quantitative and qualitative methods is made and discussed.
ARTICLE | doi:10.20944/preprints201608.0155.v1
Subject: Mathematics & Computer Science, General & Theoretical Computer Science Keywords: component-based software development; dependability attributes; availability; reliability; integrity; confidentiality; safety; maintainability
Online: 15 August 2016 (12:21:45 CEST)
The software industry has adopted component-based software development (CBSD) to rapidly build and deploy large and complex software systems with significant savings in engineering effort, cost, and time. However, CBSD encounters issues of trust and security, mainly with respect to dependability attributes. A system is considered dependable when it can produce the outputs for which it was designed with no adverse effect on its intended environment. Dependability consists of several attributes: availability, confidentiality, integrity, reliability, safety, and maintainability. Dependability attributes must be embedded in a CBSD model to develop dependable component software. Motivated by the importance of these attributes, this paper pursues two objectives: to design a model for developing a dependable system that mitigates the vulnerabilities of software components, and to evaluate the proposed model. The model proposed in this study is labelled developing dependable component-based software (2DCBS). To develop this model, the CBSD architectural phases and processes are framed and the six dependability attributes embedded according to best-practice methods. The expert opinion approach was applied to evaluate the 2DCBS framing. In addition, the 2DCBS model was applied to the development of an information and communication technology (ICT) portal through an empirical study. Vulnerability assessment tools (VATs) were employed to verify the dependability attributes of the developed ICT portal. Results show that the 2DCBS model can be adopted to develop web application systems and to mitigate the vulnerabilities of the developed systems. This study contributes to CBSD and facilitates the specification and evaluation of dependability attributes throughout model development. Furthermore, the reliability of the dependable model can increase confidence in the use of CBSD in industry.
ARTICLE | doi:10.20944/preprints202208.0523.v1
Subject: Mathematics & Computer Science, Other Keywords: angle-based outlier detection; percentile-based outlier detection; MultiPhiLDA; noise; irrelevant software requirements
Online: 30 August 2022 (11:25:24 CEST)
Noise in requirements has been known to be a defect in software requirements specifications (SRS). Detecting defects at an early stage is crucial in the process of software development. Noise can take the form of irrelevant requirements included within an SRS. A previous study attempted to detect noise in SRS, in which noise was considered an outlier. However, the resulting method demonstrated only moderate reliability due to the overshadowing of unique actor words by unique action words in the topic-word distribution. In this study, we propose a framework to identify irrelevant requirements based on the MultiPhiLDA method. The proposed framework distinguishes the topic-word distribution of actor words and action words as two separate topic-word distributions with two multinomial probability functions. Weights are used to maintain a proportional contribution of actor and action words. We also explore the use of two outlier detection methods, namely percentile-based outlier detection (PBOD) and angle-based outlier detection (ABOD), to distinguish irrelevant requirements from relevant ones. The experimental results show that the proposed framework exhibits better performance than previous methods. Furthermore, the combination of ABOD as the outlier detection method and topic coherence as the estimation approach to determine the optimal number of topics and iterations outperformed the other combinations, obtaining sensitivity, specificity, F1-score, and G-mean values of 0.59, 0.65, 0.62, and 0.62, respectively.
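A compact sketch of the ABOD idea used here: an outlying point sees the rest of the data under a narrow range of angles, so the variance of the angles it subtends to pairs of other points is low. The random 2-D data and the plain variance-of-angles score below are simplifications for illustration; the paper applies outlier detection to the topic-model representation of requirements.

```python
import itertools
import numpy as np

def abod_scores(X):
    """Variance of the angles each point subtends to pairs of other points."""
    scores = []
    for i, p in enumerate(X):
        others = np.delete(X, i, axis=0)
        angles = []
        for a, b in itertools.combinations(others, 2):
            u, v = a - p, b - p
            cos = u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
            angles.append(np.arccos(np.clip(cos, -1.0, 1.0)))
        scores.append(np.var(angles))
    return np.array(scores)

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (20, 2)), [[8.0, 8.0]]])  # planted outlier
print(abod_scores(X).argmin())   # expected: 20, the planted outlier
```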
REVIEW | doi:10.20944/preprints202107.0193.v1
Subject: Life Sciences, Biochemistry Keywords: metabolomics; plant biology; metabolomics databases; data analysis; metabolomics software tools; mass spectrometry; omics
Online: 8 July 2021 (10:46:55 CEST)
Metabolomics is now considered a wide-ranging, sensitive, and practical approach to acquiring useful information on the composition of the metabolite pool present in any organism, including plants. Investigating metabolomic regulation in plants is essential to understand their adaptation, acclimation, and defense responses to environmental stresses through the production of numerous metabolites. Moreover, metabolomics can easily be applied to the phenotyping of plants and thus has great potential for use in molecular breeding and genome editing programs to develop superior next-generation crops. This review describes the recent analytical tools and techniques available to study the plant metabolome, along with the significance of sample preparation for targeted and non-targeted methods. Advanced analytical tools, such as gas chromatography-mass spectrometry (GC-MS), liquid chromatography-mass spectrometry (LC-MS), capillary electrophoresis-mass spectrometry (CE-MS), Fourier transform ion cyclotron resonance-mass spectrometry (FTICR-MS), and matrix-assisted laser desorption/ionization (MALDI), have sped up metabolic profiling in plants. Further, we deliver a complete overview of bioinformatics tools and plant metabolome databases that can be utilized to advance our knowledge of plant biology.
ARTICLE | doi:10.20944/preprints202102.0503.v1
Subject: Mathematics & Computer Science, Algebra & Number Theory Keywords: Machine Learning; Software Testing; Quality Attributes; Deep Learning; Model Mutation testing; DNN; DL
Online: 23 February 2021 (09:22:02 CET)
This article, or technical note, is intended to provide an insight into the journey of Machine Learning Systems (MLS) testing: its evolution, current paradigm, and future work. Machine learning models are used in critical applications such as the healthcare industry, automobiles, air traffic control, and share trading, where the failure of an ML model can lead to severe consequences in terms of loss of life or property. To remediate this, developers, scientists, and the ML community around the world must build a highly reliable test architecture for critical ML applications. At the very foundation layer, any test model must satisfy the core testing attributes, such as test properties and their components. These attributes come from software engineering, but they cannot be applied as-is to ML testing, and we explain why.
ARTICLE | doi:10.20944/preprints201911.0180.v1
Subject: Engineering, Electrical & Electronic Engineering Keywords: radio virtualization; software-defined radio; network densification; infrastructure sharing; multi-tenancy; cognitive radios
Online: 15 November 2019 (16:44:52 CET)
The next generation of wireless and mobile networks will have to handle a significant increase in traffic load compared to the actual one. This situation calls for novel ways to increase spectral efficiency. Therefore in this paper, we propose a wireless spectrum hypervisor architecture that abstracts a radio frequency (RF) front-end into a configurable number of virtual RF front-ends. The proposed architecture has the ability to enable flexible spectrum access in existing wireless and mobile networks, which is a challenging task due to the limited spectrum programmability, i.e., the capability a system has to change the spectral properties of a given signal to fit an arbitrary frequency allocation. The main goal of the proposed approach is to improve spectral efficiency by efficiently using vacant gaps in congested spectrum-bandwidths or adopting network densification through infrastructure sharing. We demonstrate mathematically how our proposed approach works and present several simulation results proving its functionality and efficiency. Additionally, we designed and implemented an open-source and free proof of concept prototype of the proposed architecture, which can be used by researchers and developers to run experiments or extend the concept to other applications. We present several experimental results used to validate the proposed prototype. We demonstrate that the prototype can easily handle up to 12 concurrent physical layers.
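The core abstraction can be pictured with a toy frequency-domain split of one wideband stream into several narrower virtual front-ends. The FFT-slice channelizer below is only conceptual; a real hypervisor needs proper filter banks, overlap handling, and scheduling. The sample rate and tone frequency are assumed.

```python
import numpy as np

def virtualize(wideband, n_virtual):
    """Split one block of complex baseband samples into n_virtual subbands."""
    spectrum = np.fft.fftshift(np.fft.fft(wideband))
    chunks = np.split(spectrum, n_virtual)        # contiguous spectrum slices
    return [np.fft.ifft(np.fft.ifftshift(c)) for c in chunks]

fs = 1.92e6                                       # sample rate (assumed)
t = np.arange(1024) / fs
wideband = np.exp(2j * np.pi * 300e3 * t)         # a tone at +300 kHz
subbands = virtualize(wideband, 4)
powers = [float(np.mean(np.abs(s) ** 2)) for s in subbands]
print("most active virtual front-end:", int(np.argmax(powers)))   # 2
```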
ARTICLE | doi:10.20944/preprints201710.0085.v1
Subject: Earth Sciences, Atmospheric Science Keywords: volcanic gases; SO2; remote sensing; UV cameras; image processing; analysis software; Python 2.7
Online: 13 October 2017 (04:00:49 CEST)
UV SO2 cameras have become a common tool to measure and monitor SO2-emission-rates, mostly from volcanoes but also from anthropogenic sources (e.g. power plants or ships). In the past years, the analysis of UV SO2 camera data has seen many improvements. As a result, for many of the required analysis steps, several alternatives exist today. This inspired the development of Pyplis, an open-source software toolbox written in Python 2.7, which aims to unify the most prevalent methods from literature within a single, cross-platform analysis framework. Pyplis comprises a vast collection of algorithms relevant for the analysis of UV SO2 camera data. These include several routines to retrieve plume background radiances as well as routines for cell and DOAS based camera calibration. The latter includes two independent methods to identify the DOAS field-of-view within the camera images. Plume velocities can be retrieved using an optical flow algorithm as well as signal cross-correlation. Furthermore, Pyplis includes a routine to perform a first order correction of the signal dilution effect. All required geometrical calculations are performed within a 3D model environment allowing for distance retrievals to plume and local terrain features on a pixel basis. SO2-emission-rates can be retrieved simultaneously for an arbitrary number of plume intersections. Pyplis has been extensively and successfully tested using data from several field campaigns. Here, the main features are introduced using a dataset obtained at Mt. Etna, Italy on 16 September 2015.
ARTICLE | doi:10.20944/preprints201703.0196.v1
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: simulation software; manufacturing systems; process integration; machining optimization; Industry 4.0; knowledge-based manufacturing
Online: 27 March 2017 (10:28:34 CEST)
The future of machine tools will be dominated by highly flexible and interconnected systems in order to achieve the required productivity, accuracy, and reliability. Nowadays, distortion and vibration problems are easily solved in labs for the most common machining operations by using models based on equations describing the physical laws of the machining processes; however, additional efforts are needed to close the gap between scientific research and real manufacturing problems. In fact, there is increasing interest in developing simulation packages based on “deep knowledge and models” that aid machine designers, production engineers, or machinists in getting the best out of machine tools. This article proposes a methodology to reduce problems in machining by means of a simulation utility that uses the main variables of the system and process as input data and generates results that help in proper decision-making and machining planning. Direct benefits can be found in a) the optimal design of fixtures/clamping, b) the machine tool configuration, c) the definition of chatter-free optimum cutting conditions, and d) the right programming of cutting toolpaths at the Computer Aided Manufacturing (CAM) stage. The information- and knowledge-based approach showed successful results in several local manufacturing companies, as explained in the paper.
ARTICLE | doi:10.20944/preprints201612.0106.v2
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: simulation software; manufacturing systems; process integration; machining optimization; Industry 4.0; knowledge-based manufacturing
Online: 26 February 2017 (10:18:59 CET)
The near future of machine tools will be dominated by highly flexible and interconnected systems in order to achieve the required productivity, accuracy, and reliability. Nowadays, distortion and vibration problems are easily solved for the most common cases by using models based on equations describing the physical laws dominating the machining process; however, additional efforts are needed to close the gap between scientific research and real manufacturing problems. In fact, there is increasing interest in developing simulation packages based on “deep knowledge and models” that aid the machine designer, the production engineer, or machinists in getting the best out of their machines. This article proposes a systematic methodology to reduce problems in machining by means of a simulation utility that recognizes, collects, and uses the main variables of the system/process as input data and generates objective results that help in proper decision-making. Direct benefits of such an application are found in a) the optimal design of fixtures/clamping, b) the machine tool configuration, c) the definition of chatter-free optimum cutting conditions, and d) the right programming of cutting tool paths at the Computer Aided Manufacturing (CAM) stage. The information- and knowledge-based approach showed successful results in several local manufacturing companies.
ARTICLE | doi:10.20944/preprints202209.0256.v1
Subject: Mathematics & Computer Science, Artificial Intelligence & Robotics Keywords: quantum computing; genetic algorithms; Petri nets; Quantum Petri nets; software development; analysis and verification
Online: 19 September 2022 (03:41:37 CEST)
Evolutionary systems (ES) include software applications that solve problems using heuristic rather than deterministic methods. Classical computing used for ES development involves random methods to improve different kinds of genomes. The mappings of these genomes lead to individuals that correspond to the sought solutions. Evaluating these individuals by simulation serves to improve their genotypes. Quantum computations, unlike classical computations, can describe and simulate a large set of individuals simultaneously. This feature is used to reduce the time needed to find solutions. Quantum Petri nets (QPNs) can model dynamic systems with probabilistic features, which makes them appropriate for the development of ES. Some examples of ES applications using QPNs are given to show the benefits of the current approach.
ARTICLE | doi:10.20944/preprints202206.0061.v1
Subject: Engineering, Electrical & Electronic Engineering Keywords: computational electromagnetics; numerical methods; method of moments; antennas; radiation pattern; input impedance; simulation software
Online: 6 June 2022 (04:25:19 CEST)
This paper focuses on the combination of the method of moments and the wire-grid approximation as an effective computational technique for modeling symmetrical antennas with low computational cost and quite accurate results. The criteria and conditions for the use of wire-grid surface approximation from various sources are presented, together with new recommendations for modeling symmetrical antenna structures using the wire-grid approximation. These recommendations are used to calculate the characteristics of biconical and horn antennas at different frequencies. The results obtained using different grid and mesh settings are compared to those obtained analytically. Moreover, the results are compared to those obtained using the finite-difference time-domain numerical method, as well as to measured ones. All results are shown to be in good agreement. The recommendations used for building a symmetrical wire grid of these symmetrical antenna elements provided the most advantageous grid and mesh settings and wire radius, which give quite accurate results at low computational cost. Additionally, the known equal-area rule was modified for a rectangular grid form. The obtained radiation patterns of a conductive plate using both the original rule and the modified one are compared with the electrodynamic analysis results. It is shown that the use of the modified rule is more accurate when using a rectangular grid form.
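For reference, the classical equal-area rule mentioned above chooses the wire radius so that the total wire surface equals the surface being modeled; for a square cell of side w this gives a = w/(2*pi). The rectangular-cell variant below, which treats the two cell sides separately, is our assumption for illustration and is not the authors' modified rule.

```python
import math

def wire_radius_square(w):
    """Equal-area rule for a square grid cell of side w: a = w / (2*pi)."""
    return w / (2 * math.pi)

def wire_radius_rect(wx, wy):
    """Assumed rectangular variant: a per-direction radius from each side."""
    return wx / (2 * math.pi), wy / (2 * math.pi)

print(f"square cell, 10 mm: a = {wire_radius_square(10):.2f} mm")
ax, ay = wire_radius_rect(10, 5)
print(f"rect cell, 10 x 5 mm: ax = {ax:.2f} mm, ay = {ay:.2f} mm")
```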
ARTICLE | doi:10.20944/preprints202107.0081.v1
Subject: Engineering, Automotive Engineering Keywords: Over-Actuated Unmanned Aerial Vehicle; Nonlinear Control Allocation; Software In The Loop; Threshold Time
Online: 5 July 2021 (09:30:01 CEST)
This paper presents a study on the influence of the execution frequency of a nonlinear control allocation technique, developed by the authors and named Fast Control Allocation (FCA), for a quadrotor tilt-rotor (QTR) aircraft. Through Software-In-The-Loop (SITL) simulation, the proposed work uses the Gazebo, QGroundControl, and Matlab applications, where different FCA frequencies can be implemented separately in Matlab while the QTR stability conditions are analyzed in the virtual environment provided by Gazebo. The results showed that the FCA needs a frequency of at least 200 Hz for safe QTR flight conditions, i.e., 2 times smaller than the main control loop frequency of 400 Hz. Lower frequencies would cause instability or crashes during QTR operation.
ARTICLE | doi:10.20944/preprints202107.0013.v1
Subject: Engineering, Electrical & Electronic Engineering Keywords: Authentication and Key Agreement; Internet of Things; Physical Layer Authentication; Universal Software Radio Peripheral
Online: 1 July 2021 (11:11:46 CEST)
In this paper, we propose a lightweight physical layer aided authentication and key agreement (PL-AKA) protocol for the Internet of Things (IoT). The conventional evolved packet system AKA (EPS-AKA) used in long-term evolution (LTE) systems may suffer from congestion in core networks due to the large signaling overhead as the number of IoT devices increases. Thus, in order to alleviate this overhead, we consider cross-layer authentication by integrating physical layer approaches with cryptography-based schemes. To demonstrate the feasibility of the PL-AKA, universal software radio peripheral (USRP) based tests are conducted as well as numerical simulations. The proposed scheme shows a significant reduction in signaling overhead compared to the conventional EPS-AKA in both simulation and experiment. Therefore, the proposed lightweight PL-AKA has the potential for practical and efficient implementation in large-scale IoT networks.
ARTICLE | doi:10.20944/preprints202008.0265.v2
Subject: Mathematics & Computer Science, Artificial Intelligence & Robotics Keywords: Recommender system; learning to rank; mining software repositories; text mining; deep learning; Stack Overflow
Online: 4 September 2020 (11:20:33 CEST)
In software development, developers receive bug reports that describe software bugs. Developers find the cause of a bug by reviewing the code and reproducing the abnormal behavior, which can be a tedious and time-consuming process. Developers need an automated system that incorporates large domain knowledge and recommends a solution for those bugs, to ease the burden on developers rather than spending more manual effort on fixing the bugs or waiting on Q&A websites for other users to reply. Stack Overflow is a popular question-answer site focusing on programming issues, so we can benefit from the knowledge available in this rich platform. This paper presents a survey covering the methods in the field of mining software repositories. We propose an architecture for building a recommender system using the learning-to-rank approach. Deep learning is used to construct a model that solves the learning-to-rank problem using Stack Overflow data. Text mining techniques were employed to extract, evaluate, and recommend the answers that are most relevant to the solution of a given bug report.
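As a sketch of the learning-to-rank component, the following example trains a linear scoring function with a RankNet-style pairwise logistic loss so that more relevant candidate answers score higher. The random features stand in for text-mining features, and the linear model stands in for the paper's deep model.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 8
w_true = rng.normal(size=d)
X = rng.normal(size=(n, d))                      # candidate-answer features
relevance = X @ w_true + 0.1 * rng.normal(size=n)

w, lr = np.zeros(d), 0.01
for _ in range(2000):
    i, j = rng.integers(0, n, size=2)
    if relevance[i] == relevance[j]:
        continue
    if relevance[i] < relevance[j]:              # make i the better candidate
        i, j = j, i
    diff = X[i] - X[j]
    p = 1.0 / (1.0 + np.exp(-(w @ diff)))        # P(i ranked above j)
    w += lr * (1.0 - p) * diff                   # pairwise logistic update

best = int(np.argmax(X @ w))
print("learned top pick is truly top-5:", best in np.argsort(-relevance)[:5])
```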
ARTICLE | doi:10.20944/preprints201906.0251.v1
Subject: Physical Sciences, Applied Physics Keywords: video microscopy, imaging, automated data acquisition, nanoparticle tracking, measurement embedded applications, open-source software
Online: 25 June 2019 (12:53:50 CEST)
We introduce PyNTA, modular instrumentation software for live particle tracking. By using the multiprocessing library of Python and the distributed messaging library pyZMQ, PyNTA allows users to acquire images from a camera at close to maximum readout bandwidth while simultaneously performing computations on each image on a separate processing unit. This publisher/subscriber pattern generates a small overhead and leverages the multi-core capabilities of modern computers. We demonstrate the capabilities of the PyNTA package with the featured application of nanoparticle tracking analysis. Real-time particle tracking on megapixel images at a rate of 50 Hz is presented. Reliable live tracking reduces the required storage capacity for particle tracking measurements by a factor of approximately 10^3 compared with raw data storage, allowing for a virtually unlimited duration of measurements.
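The publisher/subscriber pattern in question can be sketched with pyZMQ and multiprocessing directly: one process publishes frames while another subscribes and processes them concurrently. The port, frame size, and sentinel protocol below are our choices for illustration, not PyNTA's internals.

```python
import multiprocessing
import time

import numpy as np
import zmq

def publisher():
    sock = zmq.Context().socket(zmq.PUB)
    sock.bind("tcp://127.0.0.1:5555")
    time.sleep(0.5)                        # let the subscription propagate
    for _ in range(10):
        sock.send_pyobj(np.random.rand(64, 64))   # stand-in camera frame
    sock.send_pyobj(None)                  # sentinel: acquisition finished

def subscriber():
    sock = zmq.Context().socket(zmq.SUB)
    sock.connect("tcp://127.0.0.1:5555")
    sock.setsockopt(zmq.SUBSCRIBE, b"")    # subscribe to everything
    while (frame := sock.recv_pyobj()) is not None:
        print("frame mean:", frame.mean()) # e.g. locate particles here

if __name__ == "__main__":
    s = multiprocessing.Process(target=subscriber)
    p = multiprocessing.Process(target=publisher)
    s.start(); p.start()
    p.join(); s.join()
```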
ARTICLE | doi:10.20944/preprints201808.0545.v2
Subject: Engineering, Electrical & Electronic Engineering Keywords: model intercomparison; renewable energy; production cost modeling; security-constrained unit commitment; open-source software
Online: 24 December 2018 (10:55:11 CET)
Background: New open-source electric-grid planning models have the potential to improve power system planning and bring a wider range of stakeholders into the planning process for next-generation, high-renewable power systems. However, it has not yet been established whether open-source models perform similarly to the more established commercial models for power system analysis. This reduces their credibility and attractiveness to stakeholders, postponing the benefits they could offer. In this paper, we report the first model intercomparison between an open-source power system model and an established commercial production cost model. Results: We compare the open-source Switch 2.0 to GE Energy Consulting's Multi Area Production Simulation (MAPS) for production-cost modeling, considering hourly operation under 17 scenarios of renewable energy adoption in Hawaii. We find that after configuring Switch with similar inputs to MAPS, the two models agree closely on hourly and annual production from all power sources. Comparing production gave a coefficient of determination of 0.996 across all energy sources and scenarios, indicating that the two models agree on 99.6% of the variation. For individual energy sources, the coefficient of determination was 69–100%. Conclusions: Although some disagreement remains between the two models, this work indicates that Switch is a viable choice for renewable integration modeling, at least for the small power systems considered here.
ARTICLE | doi:10.20944/preprints201811.0461.v1
Subject: Mathematics & Computer Science, Artificial Intelligence & Robotics Keywords: Software quality; cross-project defect prediction; multi-source; dissimilarity space; arc-cosine kernel function
Online: 19 November 2018 (11:48:50 CET)
Software defect prediction is an important means of guaranteeing software quality. Because there are often insufficient historical data within a project to train a classifier, cross-project defect prediction (CPDP) has been recognized as a fundamental approach. However, traditional defect prediction methods use feature attributes to represent samples and cannot avoid negative transfer, which may result in poorly performing models in CPDP. This paper proposes a multi-source cross-project defect prediction method based on dissimilarity space (DM-CPDP). This method first uses density-based clustering to construct the prototype set from the cluster centers of samples in the target set. Then, the arc-cosine kernel is used to form the dissimilarity space, and in this space the training set is obtained with the earth mover's distance (EMD) method. For the unlabeled samples converted from the target set, the KNN algorithm is used to label them. Finally, the TrAdaBoost method is used to establish the prediction model. The experimental results show that our approach performs better than other traditional CPDP methods.
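A small sketch of how a dissimilarity space can be built with the degree-0 arc-cosine kernel, k(x, y) = 1 - theta/pi, where theta is the angle between x and y: each sample is mapped to its kernel values against a prototype set. The random data and prototypes are stand-ins, and the method's EMD and TrAdaBoost steps are omitted here.

```python
import numpy as np

def arc_cosine_k0(x, y):
    """Degree-0 arc-cosine kernel: 1 - angle(x, y) / pi."""
    cos = x @ y / (np.linalg.norm(x) * np.linalg.norm(y))
    return 1.0 - np.arccos(np.clip(cos, -1.0, 1.0)) / np.pi

def to_dissimilarity_space(X, prototypes):
    return np.array([[arc_cosine_k0(x, p) for p in prototypes] for x in X])

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))           # software-metric vectors (stand-in)
prototypes = rng.normal(size=(5, 20))    # e.g. cluster centers of the target set
Z = to_dissimilarity_space(X, prototypes)
print(Z.shape)                           # (100, 5): the new representation
```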
REVIEW | doi:10.20944/preprints202208.0235.v1
Subject: Engineering, Other Keywords: Malware; cyber security; cyber-attacks; two factor authentication; software; targeting; privacy; causes of cyber attacks
Online: 12 August 2022 (10:33:03 CEST)
Background: Cyber security is the protection of online data and software from cyber threats. These cyber-attacks are typically intended to gain access to, change, or delete sensitive information; extort money from users; or disrupt regular corporate activities. Keeping up with new technologies is difficult, so it is necessary to keep important data safe from cyber threats. There are many types of cyber threats: malware, ransomware, social engineering, phishing, etc. To prevent cyber-attacks, one can use password manager tools such as LastPass. People also use two-factor authentication for double security on their accounts. Methods: Bodies such as the National Institute of Standards and Technology (NIST) are developing frameworks to assist firms in understanding their security risks, improving cybersecurity procedures, and preventing cyber assaults. In the fight against cybercrime and attacks, organisations need a strong foundation; there are five domains of cyber security: critical infrastructure security, application security, network security, cloud security, and Internet of Things (IoT) security. The modern US is highly dependent on computers and software, so it is especially important to be conscious of security, as attempts to hack data and accounts occur almost every day. Results and Conclusion: Nowadays, even small businesses rarely recover their losses from cyber-attacks, and many back off from continuing their businesses after being targeted by hackers. The first cybercrime attack was recorded in 1988, carried out by a graduate student. Now that large companies and even small businesses are aware of cyber-attacks, they try their best to take every precaution to prevent hacking, with double security and password manager tools.
ARTICLE | doi:10.20944/preprints202111.0239.v1
Subject: Engineering, Electrical & Electronic Engineering Keywords: radiology information systems; radiology education system; radiology operation software; information technology; computer-aided diagnosis system
Online: 12 November 2021 (17:06:10 CET)
In all areas of medicine, especially radiology, the use of computers is increasing year by year. Filmless radiology, speech recognition software, electronic application forms, and teleradiology are recent developments that have greatly improved radiologists' performance. This research explores radiology software trends, predictions, and the challenges posed by informatics, through historical trend analysis. The rationale behind this research is that information technology (IT) is growing rapidly, and we must continuously seek new ways to apply IT to make better use of resources. Consequently, IT becomes increasingly crucial to radiology organizations' innovative thinking, workflow, and business models. This study aimed to analyze all radiology software publications in the Science Citation Index (SCI). SCI was searched systematically for publications from 1991 to July 2021, and this historical method was applied throughout our radiology software research. The findings and discussions are based on an assessment of trends, predictions, contributions, and challenges in radiology software, and we trace radiology software through six evolutionary stages. The contribution of this research is that radiology managers realize that the use of new information technologies is closely related to survival in a competitive environment. Radiology companies can review these new technologies to develop more innovative business models and services and to remedy operational deficiencies.
ARTICLE | doi:10.20944/preprints202110.0237.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: Software reliability; deep learning; long short-term memory; project similarity and clustering; cross-project prediction
Online: 18 October 2021 (10:33:39 CEST)
Software reliability is an important characteristic for ensuring the quality of software products. Predicting the potential number of bugs from the beginning of a development project allows practitioners to make appropriate decisions regarding testing activities. In the initial development phases, applying traditional software reliability growth models (SRGMs) with limited past data does not always provide reliable prediction results for decision making. To overcome this, we propose a new software reliability modeling method called the deep cross-project software reliability growth model (DC-SRGM). DC-SRGM is a cross-project prediction method that uses features of previous projects' data through project similarity. Specifically, the proposed method applies cluster-based project selection to choose the training data source and performs modeling with a deep learning method. Experiments involving 15 real datasets from a company and 11 open source software datasets show that DC-SRGM can describe the reliability of ongoing development projects more precisely than existing traditional SRGMs and an LSTM model.
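The deep-learning modeling step described above can be sketched as an LSTM trained on cumulative defect counts from previously selected similar projects, then applied to an ongoing project. This is a minimal sketch under stated assumptions, not the authors' implementation: similarity-based project selection is reduced to a placeholder and all defect curves are synthetic.

```python
# Sketch: an LSTM trained on cumulative defect counts from "similar" past
# projects to predict the next count of an ongoing project. Similarity-based
# selection is a placeholder; data are synthetic.
import numpy as np
import tensorflow as tf

def windows(series, w=5):
    X = np.array([series[i:i + w] for i in range(len(series) - w)])
    return X[..., None], np.array(series[w:])

rng = np.random.default_rng(0)
# synthetic cumulative defect curves from projects deemed similar
projects = [np.cumsum(rng.poisson(3, 60)).astype(float) for _ in range(8)]
projects = [p / p.max() for p in projects]        # normalize per project

X = np.concatenate([windows(p)[0] for p in projects])
y = np.concatenate([windows(p)[1] for p in projects])

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(5, 1)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, verbose=0)

ongoing = projects[0][:10]                        # early phase of a project
pred = model.predict(ongoing[-5:][None, :, None], verbose=0)
print("next normalized cumulative count:", pred[0, 0])
```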
ARTICLE | doi:10.20944/preprints202106.0046.v1
Subject: Mathematics & Computer Science, Algebra & Number Theory Keywords: English vocabulary learning; Incidental vocabulary acquisition; Context-aware ubiquitous learning; Ubiquitous Computing; Open-source software
Online: 1 June 2021 (15:24:35 CEST)
Language learners often face communication problems when they need to express themselves but lack the ability to do so. Meanwhile, continuous advances in technology create new opportunities to improve second language (L2) acquisition through context-aware ubiquitous learning (CAUL) technology. Since vocabulary is the foundation of all language acquisition, this article presents ULearnEnglish, an open-source system for ubiquitous English learning focused on incidental vocabulary acquisition. To evaluate the proposal, 15 learners used the system, and 10 answered a survey based on the Technology Acceptance Model (TAM). Results indicate a favorable response to the use of learner context to assist learning: ULearnEnglish achieved an acceptance of 78.66% for perceived utility, 96% for perceived ease of use, 86% for user context assessment, and 88% for ubiquity. Among its main contributions, this study demonstrates an opportunity for the use of ubiquity in future language-learning research. Further studies can use the available source code to evolve the model and system.
ARTICLE | doi:10.20944/preprints202105.0449.v1
Subject: Mathematics & Computer Science, Artificial Intelligence & Robotics Keywords: Explainable Artificial Intelligence; Hopfield Neural Networks; Automatic Video Generation; Data-to-text systems; Software Visualization
Online: 19 May 2021 (14:07:48 CEST)
Hopfield Neural Networks (HNNs) are recurrent neural networks used to implement associative memory. Their main applications are pattern recognition, optimization, and image segmentation. However, it is not always easy to provide users with good explanations of the results obtained with them, mainly due to the large number of changes in the states of neurons (and their weights) produced while solving a machine learning problem. There are currently few techniques to visualize, verbalize, or abstract HNNs. This paper outlines how we can construct automatic video generation systems to explain their execution. This work constitutes a novel approach to obtaining explainable artificial intelligence systems in general, and HNNs in particular, building on the theory of data-to-text systems and software visualization approaches. We present a complete methodology to build these kinds of systems. A software architecture is also designed, implemented, and tested, and technical details of the implementation are explained. Finally, we apply our approach to create a complete explainer video about the execution of HNNs on a small recognition problem.
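To make concrete the neuron-state updates that such an explainer video would narrate, here is a minimal Hopfield network with Hebbian storage and asynchronous recall. This illustrates HNN dynamics only; it is not the authors' visualization system.

```python
# Minimal Hopfield network: Hebbian weight storage, asynchronous recall.
# Illustrates the neuron-state dynamics discussed above, nothing more.
import numpy as np

def train(patterns):
    n = patterns.shape[1]
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0)                        # no self-connections
    return W / n

def recall(W, state, steps=100, seed=0):
    rng = np.random.default_rng(seed)
    state = state.copy()
    for _ in range(steps):
        i = rng.integers(len(state))              # asynchronous update
        state[i] = 1 if W[i] @ state >= 0 else -1
    return state

patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])
W = train(patterns)
noisy = patterns[0].copy()
noisy[:2] *= -1                                   # corrupt two neurons
print("recovered:", np.array_equal(recall(W, noisy), patterns[0]))
```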
ARTICLE | doi:10.20944/preprints202007.0530.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: Informatics; Social Informatics; Information Systems; Information System Design; Disruptive Innovation; Technological Determinism; Software Life Cycle
Online: 22 July 2020 (14:07:18 CEST)
Motivation: there is a paradox at the heart of informatics, where practical implementation generally fails to understand the socio-technical impact of novel technologies and disruptive innovation when they are adopted in 'real-world' systems. This phenomenon, termed technological determinism, is manifested in a time-lag between the adoption of novel technologies and an understanding of the underlying theory, which develops only after research into their adoption. Methods: we consider informatics theory as it relates to social informatics and how humans function in society, the relationship between society and technology, information systems, information systems design, and human-computer interaction. The challenges posed by novel technologies and disruptive innovation are considered as they relate to information systems and information systems design. Open research questions and directions for future research are discussed, with an introduction to our proposed approach to socio-technical information system design. Significance: we conclude that the adoption of disruptive innovation presents both opportunities and threats for all stakeholders in computerised systems. However, determinism is a topic requiring further research to generate a suitable level of understanding, and technological determinism remains a significant challenge.
ARTICLE | doi:10.20944/preprints201901.0302.v1
Subject: Earth Sciences, Geoinformatics Keywords: interoperability; digital elevation model; Google Sketchup; geographical information systems-science; free and open source software
Online: 30 January 2019 (05:28:53 CET)
Data creation is often the only way for researchers to produce basic geospatial information for the pursuit of more complex tasks and procedures, such as those that lead to the production of new data for studies concerning river basins, slope morphodynamics, applied geomorphology and geology, urban and territorial planning, and detailed studies in, for example, architecture and civil engineering. This exercise results from a reflection on how specific data processing tasks executed in Google Sketchup (Pro version, 2018) can be used in a context of interoperability with Geographical Information Systems (GIS) software. The focus is on the production of contour lines and Digital Elevation Models (DEM) using an innovative sequence of tasks and procedures in both environments (GS and GIS). It starts in the Google Sketchup (GS) graphic interface with the selection of a satellite image of the study area, which can be anywhere on Earth's surface; subsequent processing steps lead to the production of elevation data at the selected scale and equidistance. These new data must be exported to GIS software in vector formats such as Autodesk Design Web Format (DWG) or Autodesk Drawing Exchange Format (DXF). In this essay, open-source GIS software (gvSIG and QGIS) was chosen. Correcting the original SHP by removing the “data noise” that resulted from the DXF file conversion permits the creation of new, clean vector data in SHP format and, at a later stage, the generation of DEM data. This means that new elevation data become available using simple but intuitive and interoperable procedures and techniques, which configures a costless workflow.
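The two interoperability steps described above, converting the exported DXF contours to a Shapefile and interpolating a DEM, can also be scripted with GDAL's Python bindings instead of the interactive gvSIG/QGIS workflow the essay uses. This is a sketch under stated assumptions: the file names and the ELEV elevation attribute are placeholders, and inverse-distance weighting is one of several possible interpolation choices.

```python
# Sketch of the DXF -> SHP -> DEM pipeline using GDAL's Python bindings
# (an alternative to the interactive gvSIG/QGIS workflow). File names and
# the ELEV attribute are placeholders.
from osgeo import gdal

gdal.UseExceptions()

# DXF -> ESRI Shapefile (the "noisy" conversion the essay then cleans)
gdal.VectorTranslate("contours.shp", "contours.dxf",
                     format="ESRI Shapefile")

# Interpolate a raster DEM from the contour vertices by inverse distance
# weighting; zfield names the attribute holding elevation values.
gdal.Grid("dem.tif", "contours.shp",
          algorithm="invdist:power=2", zfield="ELEV",
          outputType=gdal.GDT_Float32, width=500, height=500)
```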
ARTICLE | doi:10.20944/preprints202108.0259.v1
Subject: Life Sciences, Other Keywords: SBML; kinetic models; time-course simulation; steady-state simulation; parameter estimation; model calibration; software; web application
Online: 11 August 2021 (12:19:38 CEST)
In systems biology, biological phenomena are often modeled by ordinary differential equations (ODEs) and distributed in the de facto standard file format SBML. The primary analyses performed with such models are dynamic simulation, steady-state analysis, and parameter estimation. These methodologies are mathematically formalized, and libraries for such analyses have been published. Several tools exist to create, simulate, or visualize models encoded in SBML. However, setting up and establishing analysis environments is a crucial hurdle for non-modelers, so easy access to fundamental analyses of ODE models remains a significant challenge. To address this issue, we developed SBMLWebApp, a web-based service that executes SBML-based simulation, steady-state analysis, and parameter estimation directly in the browser without the need for any setup or prior knowledge. SBMLWebApp visualizes the result and numerical table of each analysis and provides a download of the results. SBMLWebApp allows users to select and analyze SBML models directly from the BioModels Database. Taken together, SBMLWebApp provides barrier-free access to an SBML analysis environment for simulation, steady-state analysis, and parameter estimation of SBML models. SBMLWebApp is implemented in Java™ on an Apache Tomcat® web server and uses COPASI, the SBSCL, and LibSBMLSim as simulation engines. SBMLWebApp is licensed under MIT with source code available from https://github.com/TakahiroYamada/SBMLWebApp. The program runs online at http://simulate-biology.org.
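The kind of time-course and steady-state analysis SBMLWebApp automates in the browser can be sketched locally in Python with libRoadRunner, which is a different engine from the COPASI/SBSCL/LibSBMLSim backends named above; the file name is a placeholder for any SBML model, for example one downloaded from the BioModels Database.

```python
# Sketch: local SBML time-course simulation and steady-state analysis with
# libRoadRunner (a stand-in engine; SBMLWebApp itself uses COPASI, SBSCL,
# and LibSBMLSim). "model.xml" is a placeholder SBML file.
import roadrunner

rr = roadrunner.RoadRunner("model.xml")    # load an SBML model
result = rr.simulate(0, 100, 500)          # t0, t_end, number of points
print(result[:5])                          # time course of all species

# steady-state analysis of the same model
rr.steadyState()
print(dict(zip(rr.model.getFloatingSpeciesIds(),
               rr.model.getFloatingSpeciesConcentrations())))
```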
ARTICLE | doi:10.20944/preprints202103.0406.v1
Subject: Mathematics & Computer Science, Algebra & Number Theory Keywords: cyber security; secure development; prototyping; web security; internet of things; software security; digitalization; socio-technical security
Online: 16 March 2021 (09:24:24 CET)
Secure development is a proactive approach to cyber security. Rather than building a technological solution and then securing it in retrospect, secure development strives to embed good security practices throughout the development process and thereby reduce risk. Unfortunately, evidence suggests secure development is complex, costly, and limited in practice. This article therefore introduces security-focused prototyping as a natural precursor to secure development that embeds security at the beginning of the development process, can be used to discover domain-specific security requirements, and can help organisations navigate the complexity of secure development so that the resources and commitment it requires are better understood. Two case studies, one considering the creation of a bespoke web platform and the other considering the application layer of an Internet of Things system, verify the potential of the approach and, in particular, its ability to discover domain-specific security requirements. Future work could conduct further case studies to verify the potential of security-focused prototyping and investigate its capacity to serve as a tool for reducing a broader, socio-technical kind of risk.
ARTICLE | doi:10.20944/preprints201910.0032.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: computerized revenue collection; machine learning; cyber security; software defined networks; object-oriented programming; online database management
Online: 3 October 2019 (01:45:11 CEST)
The need for an accurate and flexible system for collecting revenue from internal sources has become a matter of extreme urgency and importance in e-governance. This need underscores the eagerness on the part of governments to look for new principles and policies of revenue collection, or to become more aggressive and innovative in collecting revenue from existing sources using the present system. The revenue boards of some governments in Africa are, even now, facing many setbacks in performing their tasks due to the manual system of collecting revenue from the public. This can be improved through effective revenue collection using an accurate and flexible system. Tax is usually collected in the form of specific sales tax, general sales tax, corporate income tax, individual income tax, property tax, and inheritance tax. Problems such as high collection costs, fraud, underpayment, revenue leakage, poor access to information, and poor tracking of defaulters are on the increase. As a result, there is a need to computerize the revenue collection system, as computerized systems have proven to introduce massive efficiencies and quick collection of revenue from the public. This research work demonstrates how to design and implement an automated system for revenue collection and how to maintain a secure database of collected tax information. It also delves into how machine learning algorithms and software-defined networks can improve the security of such automated systems.
ARTICLE | doi:10.20944/preprints201805.0248.v1
Subject: Mathematics & Computer Science, General & Theoretical Computer Science Keywords: software fault prediction; data preprocessing; feature selection; rough set theory; class imbalance; noise filter; easy ensemble
Online: 17 May 2018 (13:01:51 CEST)
Software fault prediction is a highly consequential research topic for software quality assurance. Data-driven approaches provide robust mechanisms for software fault prediction; however, the prediction performance of a model depends strongly on the quality of the dataset. Many software datasets suffer from the problem of class imbalance. In this regard, under-sampling is a popular data pre-processing method for dealing with class imbalance, and Easy Ensemble (EE) presents a robust approach to achieve a high classification rate and address the bias towards majority-class samples. However, class imbalance is not the only issue that harms classifier performance: noisy examples and irrelevant features may additionally reduce predictive accuracy. In this paper, we propose a two-stage data pre-processing approach that incorporates feature selection and a new Rough Set Easy Ensemble scheme. In the feature selection stage, we eliminate irrelevant features with a feature ranking algorithm. The second stage is a new Rough Set Easy Ensemble that applies a rough K-nearest-neighbor rule filter (RK) before executing Easy Ensemble (EE), named RKEE for short; RK can remove noisy examples from both the minority and majority classes. Experimental evaluation on real-world software projects, such as NASA and Eclipse datasets, demonstrates the effectiveness of our proposed approach. Furthermore, this paper comprehensively investigates the influencing factors in our approach, such as the impact of rough set theory on the noise filter and the relationship between model performance and imbalance ratio. Comprehensive experiments indicate that the proposed approach shows outstanding, statistically significant performance in terms of the area under the curve (AUC).
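The two-stage pre-processing described above can be sketched with off-the-shelf components. This is a simplified stand-in, not the authors' RKEE: the rough-set filter is approximated by dropping samples whose label disagrees with their k nearest neighbours, and imbalanced-learn's EasyEnsembleClassifier plays the role of EE; the data are synthetic.

```python
# Sketch: feature ranking, then a KNN-based noise filter, then Easy
# Ensemble. The KNN filter approximates the rough-set RK filter described
# above; data are synthetic.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import roc_auc_score
from imblearn.ensemble import EasyEnsembleClassifier

X, y = make_classification(n_samples=1000, n_features=40,
                           weights=[0.9, 0.1], flip_y=0.05,
                           random_state=0)

# Stage 1: keep the top-ranked features
X = SelectKBest(f_classif, k=15).fit_transform(X, y)

# Stage 2a: KNN rule filter - drop samples their neighbours misclassify
knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)
keep = knn.predict(X) == y
X, y = X[keep], y[keep]

# Stage 2b: Easy Ensemble on the cleaned, reduced data
clf = EasyEnsembleClassifier(n_estimators=10, random_state=0).fit(X, y)
print("training AUC:", roc_auc_score(y, clf.predict_proba(X)[:, 1]))
```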
ARTICLE | doi:10.20944/preprints201702.0074.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: network; systems; cloud computing; data centre; performance; software-defined; virtual machine; scheduling; admission control; application-aware
Online: 20 February 2017 (04:56:24 CET)
Cloud computing refers to applications delivered as services over the Internet. Cloud systems employ policies that are inherently dynamic in nature and that depend on temporal conditions defined in terms of external events, such as the measurement of bandwidth, use of hosts, intrusion detection, or specific time events. In this paper, we investigate an optimized resource management scheme named v-Mapper. The basic premise of v-Mapper is to exploit application-awareness concepts using software-defined networking (SDN) features. This paper makes three key contributions to the field: (1) we propose a virtual machine (VM) placement scheme that can effectively mitigate VM placement issues for data-intensive applications; (2) we propose a validation scheme that ensures a service is accepted only if there are sufficient resources available for its execution; and (3) we present a scheduling policy that aims to eliminate network load constraints. An evaluation was carried out with various benchmarks and demonstrated that v-Mapper outperforms other state-of-the-art approaches in terms of average task completion time, service delay time, and bandwidth utilization. Given the growing importance of supporting large-scale data processing and analysis in datacentres, the v-Mapper system has the potential to make a positive impact on datacentre performance in the future.
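Two of the ideas above, admission control and VM placement, can be illustrated generically; the sketch below is not v-Mapper's actual algorithm but a first-fit-decreasing placement with an admission check, using made-up host and VM names.

```python
# Generic illustration (not v-Mapper's algorithm): admit a VM only if some
# host has sufficient capacity, and place admitted VMs first-fit-decreasing
# by demand. Host and VM names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Host:
    name: str
    cpu_free: float
    vms: list = field(default_factory=list)

def place(vms, hosts):
    placed, rejected = [], []
    for name, demand in sorted(vms, key=lambda v: -v[1]):
        # admission control: reject if no host can serve the request
        host = next((h for h in hosts if h.cpu_free >= demand), None)
        if host is None:
            rejected.append(name)
            continue
        host.cpu_free -= demand
        host.vms.append(name)
        placed.append((name, host.name))
    return placed, rejected

hosts = [Host("h1", 16), Host("h2", 8)]
vms = [("db", 10), ("web", 6), ("batch", 12), ("cache", 4)]
print(place(vms, hosts))
```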
REVIEW | doi:10.20944/preprints202207.0022.v1
Subject: Engineering, Electrical & Electronic Engineering Keywords: blockchain; Edge/Fog computing; IIoT architectures; Industry 4.0; interoperability; low latency; reliability; scalability; security; Software-Defined Networking
Online: 1 July 2022 (17:11:41 CEST)
The Industrial Internet of Things (IIoT) is driving an evolution in remote monitoring, intelligent analytics, and control of industrial processes. A reference architecture provides the general layout information for the flexible integration of IIoT systems; however, as the industrial world is currently in the initial stage of adopting full-stack IIoT development solutions, some challenges need to be addressed. To cope with these rising challenges and provide blueprint guidelines for developing and implementing IIoT in real time, researchers around the globe have proposed IIoT architectures based on different architectural layers and emerging technologies. In this paper, we first review and compare some widely accepted IIoT reference architectures and present a state-of-the-art review of conceptual and experimental IIoT architectures in the literature. We highlight scalability, interoperability, security, privacy, reliability, and low latency as the main IIoT architectural requirements and compare how the current architectures address these challenges. We also highlight the role of emerging technologies in current IIoT architectures in addressing these requirements and identify the literature gap for future research work to address the remaining challenges.
Subject: Engineering, Automotive Engineering Keywords: Business Intelligence; Data warehouse; Data Marts; Architecture; Data; Information; cloud; Data Mining; evolution; technology companies; tools; software
Online: 24 March 2021 (13:06:53 CET)
Information has been and will remain a vital element for individuals and departments within an organization. That is why there are technologies that help provide proper management of data; Business Intelligence is responsible for delivering technological solutions that correctly and effectively manage the entire volume of necessary and important information for companies. Among the solutions offered by Business Intelligence are Data Warehouses and Data Mining, among other business technologies that, working together, achieve the objectives proposed by an organization. It is important to highlight that these business technologies have been present since the 1950s and have been evolving over time, improving processes, infrastructure, and methodologies and implementing new technologies, which has helped correct past mistakes in information management for companies. An open question about Business Intelligence is whether, in the not-too-distant future, it will be used as an essential standard or norm in any organization for data management, since it provides many benefits and avoids failures when classifying information. On the other hand, cloud storage has become the best alternative for safeguarding information without depending on physical storage media, which are not 100% secure and are exposed to partial or total loss of information through hardware failures or security failures caused by mishandling.
BRIEF REPORT | doi:10.20944/preprints202008.0148.v1
Subject: Biology, Other Keywords: Alignment-free software tool; Coronavirus; COVID-19; D614G mutation; Sarbecovirus; SARS-CoV; SARS-CoV-2; Spike glycoprotein
Online: 6 August 2020 (10:12:00 CEST)
As reported by us and others previously (1, 2), the D614G mutation appeared in the spike glycoprotein (SPG) of SARS-CoV-2 (the pathogen behind COVID-19) at the early stages of the pandemic, and the G614-containing variant of SARS-CoV-2 then became the predominant strain in most human populations across the world. However, one of the most recent reports from India (3) stated the incidence of G614 to be only 26% in the Indian population. This report contradicts the information available through the GenBank (4) SARS-CoV-2 sequence deposits made by various laboratories from India. The above report, currently circulating in the Indian media, is likely to create a public perception that the Indian strain is less contagious, and such a notion could be harmful to people's welfare. In view of this concern, we have re-evaluated, updated, and recalculated the incidence of the G614 variant in the Indian population by analyzing 395 Indian SARS-CoV-2 genomic sequences available in GenBank as of June 26, 2020. In our analysis, we categorized the samples by the month of collection. We used an alignment-free software tool named Compare (5, 6) and the Basic Local Alignment Search Tool (BLAST) (7), and finally inspected each of the 395 sequences individually for the presence of aspartic acid (D) or glycine (G) at position 614 of the spike glycoprotein. We analyzed an Australian cohort in parallel for comparison. We found that the prevalence of the G614 variant in the Indian samples for the month of June 2020 is 90.6%. The trends are similar in the Australian samples.
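The per-sequence inspection described above (D or G at spike position 614) is straightforward to script, for example with Biopython. The FASTA file of spike protein sequences is a placeholder; real deposits containing indels would first need alignment so that position 614 is read in the aligned coordinate.

```python
# Sketch: tally D vs G at spike position 614 across a FASTA file of spike
# protein sequences. The file name is a placeholder; sequences with indels
# would need alignment first.
from collections import Counter
from Bio import SeqIO

counts = Counter()
for record in SeqIO.parse("spike_proteins.fasta", "fasta"):
    counts[str(record.seq)[613]] += 1   # position 614, 0-based index 613

total = sum(counts.values())
print(f"G614: {100 * counts['G'] / total:.1f}%  "
      f"D614: {100 * counts['D'] / total:.1f}%")
```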
ARTICLE | doi:10.20944/preprints201911.0130.v1
Subject: Biology, Ecology Keywords: bacterial calcium-carbonate precipitation (BCP); calcifying bacteria selection; calcifying mixed cultures; imagej software; biolog ecoplates; sand biocementation
Online: 12 November 2019 (16:06:06 CET)
Bacterial calcium-carbonate precipitation (BCP) has been studied for multiple applications such as remediation, consolidation, and cementation. Isolation and screening of strongly calcifying bacteria is the main task of the BCP technique. In this paper, we studied CaCO3 precipitation by different bacteria isolated from a rhizospheric soil in both solid and liquid media. Culture-dependent studies showed that bacteria belonging to Actinobacteria, Gammaproteobacteria, and Alphaproteobacteria are the dominant bacteria involved in CaCO3 precipitation in this environment. Pure and mixed cultures of selected strains were applied in sand biocementation experiments. Scanning Electron Microscopy (SEM) and Energy Dispersive X-ray Spectroscopy (EDS) analyses of the biotreated samples revealed the biological nature of the cementation and the effectiveness of the biodeposition treatment by mixed cultures. X-ray diffraction (XRD) analysis confirmed that all the calcifying strains selected for sand biocementation precipitated CaCO3, mostly in the form of calcite. In this study, the Biolog® EcoPlate is evaluated as a useful method for a more targeted choice of the sampling site, with the purpose of obtaining interesting candidates for BCP applications. Furthermore, ImageJ software was investigated, for the first time to our knowledge, as a potential method to screen for high-CaCO3-producing strains.
DATA DESCRIPTOR | doi:10.20944/preprints202209.0323.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: COVID-19; Open-source dataset; Drug Repurposing; Database system; Web application development; software development; Drug fingerprints; Bulk upload
Online: 21 September 2022 (10:14:11 CEST)
Although various vaccines are now commercially available, they have not been able to stop the spread of COVID-19 infection completely. An excellent strategy for quickly obtaining safe, effective, and affordable COVID-19 treatments is to repurpose drugs that are already approved for other diseases as adjuvants alongside the ongoing vaccine regime. Developing an accurate and standardized drug repurposing dataset requires considerable resources and expertise, owing to the commercial availability of an extensive array of drugs that could potentially be used to address SARS-CoV-2 infection. To address this bottleneck, we created the CoviRx platform. CoviRx is a user-friendly interface that provides access to manually curated COVID-19 drug repurposing data, which has been made open-source to help advance drug repurposing research. CoviRx also encourages users to submit their own findings after thoroughly validating the data, which are then merged under uniformity- and integrity-preserving constraints. This article discusses the various features of CoviRx and its design principles. CoviRx has been designed so that its functionality is independent of the data it displays; thus, in the future, this platform can be extended to any other disease X beyond COVID-19. CoviRx can be accessed at www.covirx.org.
ARTICLE | doi:10.20944/preprints202209.0134.v1
Subject: Medicine & Pharmacology, General Medical Research Keywords: nasal function; validation; software; nasal resistance; rhinomanometry; acoustic rhinometry; peak nasal inspiratory flow meter; practice patterns; objective measurement outcomes; parameters
Online: 9 September 2022 (09:41:14 CEST)
Background: The Davidson Airway Function & Nasal Evaluation (DAFNE) Scoring System was developed as an intuitive, research-based scoring system that could be validated through beta testing and easily introduced to healthcare providers of several subspecialties who treat nasal obstruction and breathing disorders (MDs, PAs, PTs, APRNs, DDSs, and DCs). This scoring system was shown to increase knowledge of airway function, nasal measurement parameters, and the identification of proper treatment options for sleep and breathing disorders. The basis for the DAFNE score was developed from a systematic review of nasal measurement data. Methods: Electronic searches of PubMed, MEDLINE, EMBASE, Cochrane Library, and Scopus for publications between 1988 and 2022 were used to identify studies validating nasal function measurement parameters to create the algorithm for the DAFNE Score™. The systematic review followed the 2020 Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Results: Twenty studies met the inclusion criteria for the systematic review. Primary outcome measurements demonstrated the reliability, repeatability, and validity of the DAFNE measurement technologies, data, and output. Conclusions: The data analysis and systematic review uncovered the need for, and a framework to develop and validate, a web-based software algorithm with global access to improve the interpretation of nasal measurements from three nasal measurement technologies. DAFNE Scoring should be used as an adjunct tool in routine clinical practice and research to further understand the technology's data output and how to collaborate with other healthcare providers to improve patient outcomes.
ARTICLE | doi:10.20944/preprints201711.0098.v1
Subject: Social Sciences, Econometrics & Statistics Keywords: Paris 2015 Agreement; CO2 emissions; VAR models; Granger causality; impulse response functions; forecast error variance decomposition; software: R; MTS; RATS
Online: 15 November 2017 (18:34:09 CET)
In this paper, a dynamic relationship between the CO2 emissions of Finland, Norway, and Sweden is presented. With the help of a VAR(2) model, and using Granger's terminology, it is shown that the emissions in Finland affect those in Norway and Sweden. Other aspects of this dynamic relationship are presented as well.
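The same VAR(2)/Granger-causality analysis can be sketched in Python with statsmodels, in place of the R/MTS/RATS tooling listed in the keywords; the emissions series below are synthetic placeholders for the three countries' data.

```python
# Sketch: fit a VAR(2), test whether Finland Granger-causes Norway, and
# compute impulse responses. Uses statsmodels instead of R/MTS/RATS;
# the series are synthetic placeholders.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n = 60
fin = rng.normal(size=n).cumsum()
emissions = pd.DataFrame({
    "Finland": fin,
    "Norway":  0.6 * np.roll(fin, 1) + rng.normal(size=n).cumsum(),
    "Sweden":  0.4 * np.roll(fin, 2) + rng.normal(size=n).cumsum(),
})

results = VAR(emissions).fit(2)                     # VAR(2)
gc = results.test_causality("Norway", ["Finland"], kind="f")
print(gc.summary())          # H0: Finland does not Granger-cause Norway

results.irf(10).plot()       # impulse responses (requires matplotlib)
```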
ARTICLE | doi:10.20944/preprints202208.0068.v1
Subject: Engineering, Electrical & Electronic Engineering Keywords: Frequency estimation; FM; sensors; Internet of Things (IoT); software-defined radio (SDR); alpha-stable noise; time-frequency distribution; deep learning
Online: 3 August 2022 (03:15:26 CEST)
Deep Learning (DL) and Machine Learning (ML) are widely used in many fields but rarely in Frequency Estimation (FE) and Slope Estimation (SE) of signals. Frequency and slope estimation for Frequency-Modulated (FM) and single-tone sinusoidal signals is essential in various applications, such as wireless communications, sonar, and radar measurements. In this work, an artificial neural network (ANN) and a convolutional neural network (CNN) are used for frequency and slope estimation of FM signals under Additive White Gaussian Noise (AWGN) and Additive Symmetric alpha-Stable Noise (SαSN). SαS distributions are impulsive noise disturbances found in many communication environments, such as marine systems; they lack a closed-form Probability Density Function (PDF), except for specific cases, and have infinite second-order statistics, hence Geometric SNR (GSNR) is used in this work to determine the impulsiveness of noise in a mixture of Gaussian and SαS noise processes. The ANN is a machine learning classifier designed with a few layers to reduce FE and SE complexity while achieving higher accuracy than classical techniques. The CNN is a deep learning classifier designed with many layers for FE and SE, and it proves more accurate than the ANN when dealing with big data and finding optimal features. Simulation results show that SαS noise can be much more harmful to FE and SE of FM signals than Gaussian noise. DL and ML can significantly reduce FE complexity, memory cost, and power consumption, which is important in many systems such as Internet of Things (IoT) sensor applications. After training a deep CNN (DCNN) for frequency and slope estimation of LFM signals, the DCNN's accuracy remains acceptable at very low signal-to-noise ratios where the time-frequency distribution (TFD) approach fails, extending the GSNR working range by more than 20 dB.
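A toy stand-in for the neural frequency estimator described above can be built in a few lines: single-tone signals corrupted by symmetric alpha-stable noise, classified into discrete frequency bins by a small dense network. The architecture, noise parameters, and frequency grid are illustrative assumptions, not the paper's settings.

```python
# Toy frequency-estimation-as-classification sketch: single-tone signals in
# symmetric alpha-stable (SaS) noise, classified into frequency bins by a
# small dense ANN. All parameters are illustrative, not the paper's.
import numpy as np
from scipy.stats import levy_stable
import tensorflow as tf

N, fs, n_classes = 128, 1000.0, 8
freqs = np.linspace(50, 400, n_classes)
rng = np.random.default_rng(0)

def make_batch(size):
    y = rng.integers(n_classes, size=size)
    t = np.arange(N) / fs
    clean = np.sin(2 * np.pi * freqs[y][:, None] * t)
    noise = levy_stable.rvs(alpha=1.5, beta=0, scale=0.3,
                            size=(size, N), random_state=rng)
    return (clean + noise).astype(np.float32), y

X_train, y_train = make_batch(4000)
X_test, y_test = make_batch(500)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(N,)),
    tf.keras.layers.Dense(n_classes, activation="softmax"),
])
model.compile("adam", "sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X_train, y_train, epochs=10, verbose=0)
print("test accuracy:", model.evaluate(X_test, y_test, verbose=0)[1])
```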