ARTICLE | doi:10.20944/preprints201704.0031.v1
Online: 5 April 2017 (18:09:25 CEST)
Models come in different forms: visual, arithmetic, mental, physical. The most common type is arguably the mental model, which people use to view and interpret the world. A model can be described as a simplified representation of a problem or a situation. The process of building or developing a model is called modelling. A model, once developed by the modeller, can be ‘owned’ by a manager or decision maker. The ideal is to make the model an extension of the user’s ability to think about and analyse problems or situations. When used properly, taking its limitations into consideration, an economic model for a project can provide insight for decision makers when making the crucial decision to approve a project. An economic model for a liquefied natural gas (LNG) project is shown as an example.
CONCEPT PAPER | doi:10.20944/preprints202301.0049.v1
Subject: Social Sciences, Education Studies Keywords: Quality Assurance in Higher Education in Zambia; Higher Education and Quality Assurance; Zambia’s Higher Education
Online: 4 January 2023 (02:58:38 CET)
This paper discusses the concept of quality assurance (QA) in higher education, its implications for higher education institutions (HEIs), and the possible challenges. The study evaluated literature concerning QA in Zambia and elsewhere. The findings show that QA is implemented through external and internal mechanisms such as accreditation, registration, institutional auditing, the use of external examiners, self-evaluation, and peer reviews. The implications for HEIs in Zambia are that academic programmes need to be accredited with the Higher Education Authority (HEA). Further, HEIs should establish QA units to spearhead quality issues and introduce the use of external examiners to ensure quality. The challenges identified in implementing QA in HEIs include inadequate funding and infrastructure, a shortage of qualified academic staff, and the lack of standalone QA units in some HEIs. It is therefore recommended, among other things, that government improve funding of public HEIs and construct infrastructure, and that HEIs establish QA units and recruit and retain qualified academic staff to ensure the quality of education.
SHORT NOTE | doi:10.20944/preprints202006.0259.v1
Online: 21 June 2020 (11:01:58 CEST)
The detection of SARS-CoV-2 in sewage and water resources has increased public awareness of the possible survival of SARS-CoV-2 in the environment and its potential to be transmitted to humans through the food chain or water resources. Moreover, surfaces contaminated by the virus need to be disinfected frequently with an effective disinfectant. This chapter discusses the efficiency of the most common traditional sewage and wastewater treatment processes and their role in eliminating the virus, as well as the sterility assurance level concept. The chemical disinfectants currently in use and their temporary efficiency are also reviewed.
ARTICLE | doi:10.20944/preprints202110.0411.v1
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: Automotive; Resistance Spot Welding; Quality Assurance; Quality Monitoring; Artificial Intelligence
Online: 27 October 2021 (13:27:03 CEST)
Resistance spot welding is an established joining process in the production of safety-relevant components in the automotive industry. Continuous process monitoring is therefore essential to meet the high quality requirements. Artificial neural networks can be used to evaluate the process parameters and signals to ensure the quality of each individual spot weld. The predictive accuracy of such algorithms depends on the provided training data set, and the prediction of untrained data is challenging. The aim of this paper is to investigate the extrapolation capability of a multi-layer perceptron model; that is, the predictive performance of the model is tested on data that clearly differ from the training data in terms of material and coating composition. To this end, three multi-layer perceptron regression models were implemented to predict the nugget diameter from process data. All three models predicted the trained datasets very well. The models that were provided with features from the dynamic resistance curve predicted the new dataset better than the model trained on process parameters alone. This study shows the beneficial influence of the process signals on the predictive accuracy and robustness of artificial neural network algorithms, especially when predicting a data set from outside the training space.
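The regression setup described above can be sketched with scikit-learn. This is a minimal illustration, not the paper's actual pipeline: the feature names, value ranges, and the synthetic target function are invented stand-ins for the real process parameters and dynamic-resistance features.

```python
# Sketch of nugget-diameter regression with a multi-layer perceptron.
# All data here are synthetic; feature names are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical inputs: weld current (kA), weld time (ms), electrode force (kN),
# plus one feature extracted from the dynamic resistance curve.
X = rng.uniform([6, 100, 2, 0.1], [10, 400, 5, 1.0], size=(200, 4))
# Synthetic target: nugget diameter (mm) as a noisy function of the inputs.
y = (0.5 * X[:, 0] + 0.005 * X[:, 1] + 0.3 * X[:, 2] + 2.0 * X[:, 3]
     + rng.normal(0, 0.1, 200))

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
)
model.fit(X, y)
pred = model.predict(X[:5])
print(pred.shape)
```

Testing such a model on welds with a different material or coating composition (data outside the training ranges above) is what the abstract calls extrapolation.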
ARTICLE | doi:10.20944/preprints202010.0092.v1
Subject: Engineering, Automotive Engineering Keywords: ANOVA; Asphalt Binder Extraction; Quality Assurance; Quality Control; RAP; RAS
Online: 5 October 2020 (14:06:31 CEST)
Asphalt binder extraction requires further investigation to be carried out accurately and precisely, since it is a significant procedure for quality control/quality assurance (QC/QA) and subsequent binder characterization. In this research, the authors provide hands-on experience with binder extraction and deliver recommendations concerning the sensitive steps that may affect the outcomes (extracted binder content, Pbe%). Based on extraction by the centrifuge method, two mineral matter determination methods (ashing and centrifuge) were addressed. Field cores were investigated by comparing the Pbe% to the actual binder content, Pba%. Analysis of variance (ANOVA) and Tukey post-hoc statistical analyses, in addition to linear least-squares regression, were used to assess the significance of the differences across 38 cores randomly obtained from field segments (in-service roads) within the first two weeks from the construction date. These cores involved reclaimed asphalt pavement (RAP), reclaimed asphalt shingles (RAS), and a wide range of additives. The two extraction methods were compared, and the centrifuge method was recommended based on a quantitative evaluation, as it delivered the same average Pba% over the 38 cores. Furthermore, the centrifuge method provided substantial savings in experimental time (almost half the time required for the ashing method). The ashing outcomes were found to be equal to the centrifuge outcomes when the ammonium carbonate addition was disregarded. It is therefore recommended to reassess the ammonium carbonate addition, as it might excessively compensate for mineral matter that has not actually been lost in the ignition oven.
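The statistical comparison described above — one-way ANOVA followed by Tukey's HSD — can be sketched with SciPy. The group data below are invented for illustration; they do not reproduce the study's measurements.

```python
# Minimal sketch of the one-way ANOVA and Tukey HSD comparison used to
# test whether extraction methods differ. Group data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical extracted binder contents (Pbe%) from the two methods.
ashing = rng.normal(5.20, 0.15, 19)
centrifuge = rng.normal(5.25, 0.15, 19)

# One-way ANOVA: is there a significant difference between group means?
f_stat, p_value = stats.f_oneway(ashing, centrifuge)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")

# Tukey's HSD post-hoc pairwise comparison (scipy >= 1.8).
res = stats.tukey_hsd(ashing, centrifuge)
print(f"pairwise p = {res.pvalue[0, 1]:.3f}")
```

With more than two groups (e.g., cores grouped by additive type), the same two calls extend naturally; Tukey's HSD then identifies which specific pairs differ.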
Keywords: Internet of Things (IoT); Quality Assurance; Testing; Artificial Intelligence (AI)
Online: 9 December 2019 (07:39:47 CET)
IoT is a fast-growing technology with promising potential for shaping our future. In this fast-growing world of IoT, systems are often released without proper testing, which affects their quality and does not guarantee user satisfaction. Different testing methodologies are carried out to ensure the quality assurance of IoT systems before they are released to the market. In this paper we review different testing techniques that use AI, as well as different tools, to ensure the quality of IoT. We also review different IoT challenges related to quality.
Subject: Engineering, Other Keywords: SACDM; SOS; SQA; key factors software quality assurance; Scrum; stakeholder
Online: 9 December 2019 (07:37:30 CET)
The main aim of this study is to examine the Software Quality Assurance (SQA) issues faced by project stakeholders in a Scrum environment and their consequences. This inductive case study identifies SQA principles relevant to meeting user expectations. Stakeholders in Scrum projects lack concrete guidance on Scrum’s SQA approaches, methods, and techniques. This insufficiency of concrete guidelines in Scrum requires a management squad to develop concepts that can include implementing practices from other methodologies and wisely modifying the system structure to incorporate the adopted practices, ensuring improvement in the processes and creating a shared-ownership environment. By illustrating the incompleteness of Agile approaches, with special attention to the lack of concrete instructions in Scrum, the study adapts techniques from the literature and argues for Scrum’s versatility and simplicity.
ARTICLE | doi:10.20944/preprints202211.0015.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: Unmanned Aerial Vehicle (UAV); Lightweight Blockchain; Drone Security; assurance; authentication; resilience
Online: 1 November 2022 (04:07:30 CET)
Rapid advancements in fifth generation (5G) communication technology and the mobile edge computing (MEC) paradigm have led to the proliferation of unmanned aerial vehicles (UAVs) in urban air mobility (UAM) networks, which provide intelligent services for diversified smart city scenarios. Meanwhile, the widely deployed internet of drones (IoD) in smart cities also raises new concerns about performance, security, and privacy. The centralized framework adopted by conventional UAM networks is not adequate to handle high mobility and dynamicity. Moreover, it is necessary to ensure device authentication, data integrity, and privacy preservation in UAM networks. Thanks to its characteristics of decentralization, traceability, and unalterability, blockchain is recognized as a promising technology to enhance security and privacy for UAM networks. In this paper, we introduce LightMAN, a lightweight microchained fabric for data assurance and resilience-oriented UAM networks. LightMAN is tailored for small-scale permissioned UAV networks, in which a microchain acts as a lightweight distributed ledger for security guarantees. Participants are thus enabled to authenticate drones and verify the genuineness of data sent to/from drones without relying on a third-party agency. In addition, a hybrid on-chain and off-chain storage strategy is adopted that not only improves performance (e.g., latency and throughput) but also ensures privacy preservation for sensitive information in UAM networks. A proof-of-concept prototype is implemented and tested on a Micro Air Vehicle Link (MAVLink) simulator. The experimental evaluation validates the feasibility and effectiveness of the proposed LightMAN solution.
REVIEW | doi:10.20944/preprints202106.0320.v1
Keywords: RT-qPCR; assay validity; standard curve; quality assurance; quality control; wastewater surveillance
Online: 11 June 2021 (14:10:51 CEST)
The coronavirus disease 2019 (COVID-19) pandemic has led to wastewater surveillance becoming an important tool for monitoring the spread of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) within communities. As a result, molecular methods, in particular reverse transcription-quantitative PCR (RT-qPCR), have been employed to generate large data sets aimed at the detection and quantification of SARS-CoV-2 in wastewater. Although RT-qPCR is rapid and sensitive, there is no standard method that fits all use cases; there are no certified quantification standards, and experiments are carried out using numerous different assays, reagents, instruments, and data analysis protocols. These variations can lead to the reporting of erroneous quantitative data, resulting in potentially misleading interpretations and conclusions. We have reviewed the SARS-CoV-2 wastewater surveillance literature, focusing on the variability of RT-qPCR data as revealed by inconsistent standard curves and associated parameters. We find that variation in these parameters and deviations from best practices as described in the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) guidelines suggest a lack of reproducibility and reliability in quantitative measurements of SARS-CoV-2 RNA in wastewater.
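The standard-curve parameters the review scrutinizes (slope, intercept, R², amplification efficiency) follow directly from a linear fit of Cq against log10 copy number. A minimal sketch, using invented dilution-series values:

```python
# Standard-curve parameters for RT-qPCR quantification.
# The dilution series below is synthetic, chosen to give a slope of -3.3.
import numpy as np

log10_copies = np.array([1, 2, 3, 4, 5, 6], dtype=float)  # log10 of standards
cq = np.array([36.5, 33.2, 29.9, 26.6, 23.3, 20.0])       # measured Cq values

slope, intercept = np.polyfit(log10_copies, cq, 1)
r = np.corrcoef(log10_copies, cq)[0, 1]
# Amplification efficiency; 100% corresponds to a slope of about -3.32.
efficiency = 10 ** (-1.0 / slope) - 1.0

print(f"slope={slope:.2f}, R^2={r**2:.3f}, E={efficiency:.1%}")

# Quantify an unknown sample from its Cq via the fitted curve.
cq_unknown = 25.0
copies = 10 ** ((cq_unknown - intercept) / slope)
```

Inconsistent slopes, poor R², or efficiencies far from 100% across laboratories are exactly the kinds of deviations from MIQE best practice the review reports.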
ARTICLE | doi:10.20944/preprints202012.0064.v1
Subject: Engineering, Other Keywords: Smart City, Urban ICT, Open Urban Platforms, Sustainable Cities, Resiliency, Quality Assurance
Online: 2 December 2020 (14:08:40 CET)
Information and Communication Technology (ICT) is at the heart of the Smart City approach, which constitutes the next level of development of cities and communities across the globe. ICT serves as the glue that enables different domains to interact with each other, facilitates the management and processing of vast amounts of data and information towards intelligently steering a city's infrastructure and processes, engages the citizens, and enables new services and applications across various aspects of urban life - e.g. supply chains, mobility, transportation, energy, citizens’ participation, public safety, interactions between citizens and the public administration, water management, parking, and many other use cases and domains. Hence, given the fundamental role of ICT in cities in the near future, it is of paramount importance to lay the ground for a sustainable and reliable ICT infrastructure that enables a city or community to respond in a resilient way to upcoming challenges whilst increasing the quality of life for its citizens. This paper constitutes a continuation of a series of research documents and standardization activities relating to the concept of Open Urban Platforms (OUP) and the way they serve as a blueprint for each city or community towards establishing an ICT backbone. The current paper emphasizes the aspects of sustainability and resilient ICT, whilst reporting on our latest activities and related developments in this research area.
Keywords: myocardial perfusion imaging; positron emission tomography; Rubidium-82 generator; daily quality assurance
Online: 22 April 2020 (07:32:27 CEST)
Introduction. Strontium-82/Rubidium-82 (82Sr/82Rb) generators are widely used for positron emission tomography (PET) imaging of myocardial perfusion. In this study, the 82Rb isotope yield and production efficiency of two FDA-approved 82Sr/82Rb generators were compared. Methods. N = 515 sequential daily quality assurance (QA) reports from 9 CardioGen-82® and 9 RUBY-FILL® generators were reviewed over a period of 2 years. A series of test elutions was performed at different flow rates on the RUBY-FILL® system to determine an empirical correction factor used to convert CardioGen-82® daily QA values of 82Rb activity (dose-calibrator ‘maximum’ of a 50 mL elution at 50 mL/min) to RUBY-FILL® equivalent values (integrated ‘total’ of a 35 mL elution at 20 mL/min). The generator yield (82Rb) and production efficiency (82Rb yield / 82Sr parent activity) were measured and compared after this conversion to a common scale. Results. At the start of clinical use, the system-reported 82Rb activity from daily QA was lower for CardioGen-82® vs RUBY-FILL® (2.3 ± 0.2 vs 3.0 ± 0.2 GBq, p < 0.001) despite similar 82Sr activity. Dose-calibrator ‘maximum’ (CardioGen-82®) values were found to under-estimate the integrated ‘total’ (RUBY-FILL®) activity by ~24% at 50 mL/min. When these data were used to convert the CardioGen-82® values to a common measurement scale (integrated total activity), the CardioGen-82® efficiency remained slightly lower than that of the RUBY-FILL® system on average (88 ± 4% vs 95 ± 4%, p < 0.001). The efficiency of 82Rb production improved for both systems over the respective periods of clinical use. Conclusions. 82Rb generator yield was significantly under-estimated using the CardioGen-82® vs RUBY-FILL® daily QA procedure. When generator yield was expressed as the integrated total activity for both systems, the estimated 82Rb production efficiency of the CardioGen-82® system was ~7% lower than that of RUBY-FILL® over the full period of clinical use.
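The conversion and efficiency calculation described in the methods reduce to simple arithmetic. A sketch using the abstract's reported figures (the 82Sr parent activity below is an assumed value, not taken from the study):

```python
# Convert a dose-calibrator 'maximum' QA reading to an integrated-total
# equivalent via an empirical correction factor, then compute production
# efficiency. The ~24% under-estimation is as reported in the abstract;
# the 82Sr parent activity is a hypothetical value for illustration.
max_activity_gbq = 2.3      # CardioGen-82 daily QA 'maximum' reading (GBq)
correction = 1.24           # corrects the ~24% under-estimation at 50 mL/min
sr82_activity_gbq = 3.2     # assumed 82Sr parent activity (GBq)

total_activity_gbq = max_activity_gbq * correction
efficiency = total_activity_gbq / sr82_activity_gbq  # 82Rb yield / 82Sr activity
print(f"efficiency = {efficiency:.0%}")
```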
Subject: Engineering, Other Keywords: Internet of Things(IoTs); Challenges; Test Strategies; Quality Assurance; Suggestion; Interoperability; Security
Online: 10 December 2019 (16:15:13 CET)
Immense challenges arise in the Quality Assurance area due to contemporary developments in Internet of Things (IoT) technology. Current issues are mainly related to test coverage, test diversity, IoT stability, the use of cellular networks in IoT, IoT device updates, security, data integration, and interoperability. In this paper, we present these issues along with suggestions for tackling them.
ARTICLE | doi:10.20944/preprints202103.0135.v1
Subject: Mathematics & Computer Science, Algebra & Number Theory Keywords: Machine Learning Applications; Quality Assurance Methodology; Process Model; Automotive Industry and Academia; Best Practices; Guidelines
Online: 3 March 2021 (14:11:09 CET)
Machine learning is an established and frequently used technique in industry and academia, but a standard process model to improve the success and efficiency of machine learning applications is still missing. Project organizations and machine learning practitioners need guidance throughout the life cycle of a machine learning application to meet business expectations. We therefore propose a process model for the development of machine learning applications that covers six phases, from defining the scope to maintaining the deployed machine learning application. The first phase combines business and data understanding, as data availability often affects the feasibility of the project. The sixth phase covers state-of-the-art approaches for monitoring and maintaining a machine learning application, as the risk of model degradation in a changing environment is imminent. With each task of the process, we propose quality assurance methodology suitable for addressing the challenges in machine learning development that we identify in the form of risks. The methodology is drawn from practical experience and scientific literature and has proven to be general and stable. The process model expands on CRISP-DM, a data mining process model that enjoys strong industry support but fails to address machine learning specific tasks. Our work proposes an industry- and application-neutral process model tailored for machine learning applications, with a focus on technical tasks for quality assurance.
ARTICLE | doi:10.20944/preprints202011.0585.v1
Subject: Engineering, Automotive Engineering Keywords: virtual reality; point cloud data; building information modeling; work inspection; progress monitoring; quality assurance inspection
Online: 23 November 2020 (14:03:47 CET)
Despite recent developments in monitoring and visualizing construction progress, the data exchange between the construction jobsite and the office lacks automation and real-time recording. To address this issue, a near real-time construction work inspection system called iVR is proposed; this system integrates 3D scanning, extended reality, and visual programming to visualize interactive onsite inspection for indoor activities and provide numeric data. iVR comprises five modules: iVR-location finder (finding the laser scanner located on the construction site), iVR-scan (capturing point cloud data of jobsite indoor activity), iVR-prepare (processing and converting 3D scan data into a 3D model), iVR-inspect (conducting immersive visual reality inspection in the construction office), and iVR-feedback (visualizing inspection feedback from the jobsite using augmented reality). An experimental lab test was conducted to validate the applicability of the iVR process; it successfully exchanged the required information between the construction jobsite and the office within a specific time. This system is expected to assist engineers and workers in quality assessment, progress assessment, and decision-making, realizing a productive and practical communication platform, unlike conventional monitoring or data capturing, processing, and storage methods, which involve storage, compatibility, and time-consumption issues.
ARTICLE | doi:10.20944/preprints202211.0034.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: Blockchain; Smart Contract; Point Cloud; Security; Privacy Preservation; Software-Defined Network (SDN); Big Data; Assurance; Resilience
Online: 2 November 2022 (02:18:50 CET)
The rapid development of three-dimensional (3D) acquisition technology based on 3D sensors provides a large volume of data, which is often represented in the form of point clouds. Point cloud representation can preserve the original geometric information along with associated attributes in a 3D space. Therefore, it has been widely adopted in many scene-understanding applications such as virtual reality (VR) and autonomous driving. However, the massive amount of point cloud data aggregated from distributed 3D sensors also poses challenges for secure data collection, management, storage, and sharing. Thanks to its decentralized and secure nature, blockchain has great potential to improve point cloud services and enhance security and privacy preservation. Inspired by the rationales behind Software Defined Network (SDN) technology, this paper envisions SAUSA, a blockchain-based authentication network that is capable of recording, tracking, and auditing the access, usage, and storage of 3D point cloud datasets over their life cycle in a decentralized manner. SAUSA adopts an SDN-enabled point cloud service architecture, which allows for efficient data processing and delivery to satisfy diverse Quality-of-Service (QoS) requirements. A blockchain-based authentication framework is proposed to ensure security and privacy preservation in point cloud data acquisition, storage, and analytics. By leveraging smart contracts for digitizing access control policies and point cloud data on the blockchain, data owners have full control of their 3D sensors and point clouds. In addition, anyone can verify the authenticity and integrity of point clouds in use without relying on a third party. Moreover, SAUSA integrates a decentralized storage platform to store encrypted point clouds while recording references to the raw data on the distributed ledger.
Such a hybrid on-chain and off-chain storage strategy not only improves robustness and availability but also ensures privacy preservation for sensitive information in point cloud applications. A proof-of-concept prototype is implemented and tested on a physical network. The experimental evaluation validates the feasibility and effectiveness of the proposed SAUSA solution.
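The hybrid strategy described above — bulky data off-chain, only a tamper-evident reference on-chain — can be illustrated with a hash-based sketch. The dict-based "ledger" below is a stand-in for the actual distributed ledger, not the SAUSA API.

```python
# Conceptual sketch of hybrid on-chain/off-chain storage: the point cloud
# stays off-chain; only its SHA-256 digest is recorded on the 'ledger'.
# Both stores are plain dicts here, standing in for real infrastructure.
import hashlib

def store_point_cloud(ledger: dict, off_chain: dict, name: str, data: bytes) -> str:
    digest = hashlib.sha256(data).hexdigest()
    off_chain[digest] = data   # encrypted blob would live in distributed storage
    ledger[name] = digest      # only the compact reference goes on-chain
    return digest

def verify_point_cloud(ledger: dict, off_chain: dict, name: str) -> bool:
    # Recompute the hash of the off-chain blob and compare with the ledger.
    digest = ledger[name]
    return hashlib.sha256(off_chain[digest]).hexdigest() == digest

ledger, off_chain = {}, {}
store_point_cloud(ledger, off_chain, "scan-001", b"x y z 1.0 2.0 3.0")
print(verify_point_cloud(ledger, off_chain, "scan-001"))
```

Any modification of the off-chain blob changes its hash, so verification fails without needing a trusted third party; this is the integrity property both SAUSA and LightMAN rely on.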
ARTICLE | doi:10.20944/preprints201710.0016.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: citizen science; volunteered geographical information; metadata; data quality; quality assurance; scientific workflow; provenance; metaquality; open data
Online: 3 October 2017 (13:52:29 CEST)
Environmental policy involving citizen science (CS) is of growing interest. In support of this open data stream, the validation or quality assessment of CS data and their appropriate usage for evidence-based policy making need a flexible and easily adaptable data curation process that ensures transparency. Addressing these needs, this paper describes an approach to automatic quality assurance as proposed by the Citizen OBservatory WEB (COBWEB) FP7 project. This approach is based upon a workflow composition that combines different quality controls, each belonging to one of seven categories or ‘pillars’. Each pillar focuses on a specific dimension in the types of reasoning algorithms for CS data qualification. These pillars attribute values to a range of quality elements belonging to three complementary quality models. Additional data from various sources, such as Earth Observation (EO) data, are often included among the inputs of the quality controls within the pillars. However, qualified CS data can also contribute to the validation of EO data; the question of validation can therefore be considered as ‘two sides of the same coin’. Based on an invasive-species CS study concerning Fallopia japonica (Japanese knotweed), the paper discusses the flexibility and usefulness of qualifying CS data, either when using EO data for validation within the quality assurance process or when validating an EO data product that describes the risk of occurrence of the plant. Both validation paths are found to be improved by quality assurance of the CS data. In addressing the reliability of CS open data, issues and limitations of the role of quality assurance for validation, due to the quality of secondary data used within the automatic workflow (e.g. error propagation), are described, paving the route to improvements in the approach.
Subject: Mathematics & Computer Science, Numerical Analysis & Optimization Keywords: harmony search; meta-heuristic; parameter optimization; software defect prediction; just-in-time prediction; software quality assurance; maintenance; maritime transportation
Online: 31 December 2020 (09:27:46 CET)
Software is playing the most important role in recent vehicle innovation, and consequently the amount of software has been growing rapidly over recent decades. The safety-critical nature of ships, one sort of vehicle, has made Software Quality Assurance (SQA) a fundamental prerequisite. Just-In-Time Software Defect Prediction (JIT-SDP) aims to conduct software defect prediction (SDP) on commit-level code changes to achieve effective SQA resource allocation. The first case study of SDP in the maritime domain reported feasible prediction performance. However, we consider that the prediction model still has room for improvement, since its parameters have not yet been optimized. Harmony Search (HS) is a widely used music-inspired meta-heuristic optimization algorithm. In this article, we demonstrate that JIT-SDP can produce better prediction performance by applying HS-based parameter optimization with a balanced fitness value. Using two real-world datasets from a maritime software project, we obtained an optimized model that exceeds the performance criterion of the previous case study’s baseline across various defect-to-non-defect class imbalance ratios. Experiments with open source software also showed better recall for all datasets, even though we used balance as the performance index. HS-based parameter-optimized JIT-SDP can be applied to maritime-domain software with high class imbalance ratios. Finally, we expect that our research can be extended to improve the performance of JIT-SDP not only in maritime-domain software but also in open source software.
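Harmony Search, the optimizer applied above, can be sketched compactly. This is a generic textbook-style implementation under assumed default parameters (harmony memory size, HMCR, PAR), not the article's tuning setup; the toy objective stands in for a defect-prediction model's fitness.

```python
# Minimal harmony search sketch for parameter optimization (minimization).
# hms = harmony memory size, hmcr = memory considering rate,
# par = pitch adjusting rate. Parameter values are illustrative defaults.
import random

def harmony_search(objective, bounds, hms=10, hmcr=0.9, par=0.3, iters=500, seed=0):
    rng = random.Random(seed)
    dim = len(bounds)
    # Initialize harmony memory with random solutions within bounds.
    memory = [[rng.uniform(*bounds[d]) for d in range(dim)] for _ in range(hms)]
    scores = [objective(h) for h in memory]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if rng.random() < hmcr:                    # memory consideration
                value = rng.choice(memory)[d]
                if rng.random() < par:                 # pitch adjustment
                    value += rng.uniform(-0.01, 0.01) * (bounds[d][1] - bounds[d][0])
                value = min(max(value, bounds[d][0]), bounds[d][1])
            else:                                      # random selection
                value = rng.uniform(*bounds[d])
            new.append(value)
        score = objective(new)
        worst = max(range(hms), key=lambda i: scores[i])
        if score < scores[worst]:                      # replace worst harmony
            memory[worst], scores[worst] = new, score
    best = min(range(hms), key=lambda i: scores[i])
    return memory[best], scores[best]

# Toy objective standing in for "1 - balance" of a JIT-SDP model.
best, best_score = harmony_search(
    lambda p: (p[0] - 0.3) ** 2 + (p[1] - 0.7) ** 2,
    [(0.0, 1.0), (0.0, 1.0)],
)
print(best, best_score)
```

In the article's setting, the objective would instead train and evaluate the JIT-SDP model with the candidate hyper-parameters and return a balance-based fitness value.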
ARTICLE | doi:10.20944/preprints201806.0155.v1
Subject: Earth Sciences, Space Science Keywords: essential climate variables; climate data records; earth observation satellites; quality assurance; traceability; user requirements; climate applications; surface albedo; LAI; FAPAR; NO2; HCHO; CO
Online: 11 June 2018 (11:24:59 CEST)
Data from Earth Observation (EO) satellites are increasingly used to monitor the environment, understand variability and change, inform evaluations of climate model forecasts, and manage natural resources. Policy makers progressively rely on the information derived from these datasets to make decisions on mitigating and adapting to climate change. These decisions should be evidence based, which requires confidence in the derived products as well as in the reference measurements used to calibrate, validate, or inform product development. In support of the Copernicus Climate Change Service of the European Union’s Earth Observation Programme, the Quality Assurance for Essential Climate Variables (QA4ECV) project filled a gap in the delivery of climate-quality satellite-derived datasets by prototyping a robust, generic system for the implementation and evaluation of Quality Assurance (QA) measures for satellite-derived ECV climate data record products. The project demonstrated the QA system on six new long-term, climate-quality ECV data records for surface albedo, Leaf Area Index, FAPAR, NO2, HCHO, and CO. The provision of standardized QA information gives data users evidence-based confidence in the products and enables judgement on the fitness for purpose of the various ECV data products for their specific applications.
ARTICLE | doi:10.20944/preprints202010.0224.v1
Subject: Engineering, Control & Systems Engineering Keywords: Remotely Piloted Aircraft Systems; RPAS; Unmanned Aerial Vehicles; UAV; Unmanned Aerial Systems; UAS; Detect and Avoid; DAA; Separation Assurance; Self Separation; Collision Avoidance; Situational Awareness; Drones; Aircraft; ADS-B; Real Time Simulations
Online: 12 October 2020 (10:31:56 CEST)
Remotely Piloted Aircraft Systems (RPAS) are increasingly becoming relevant actors in the airspace and will assume much more importance in the future. In order to allow their safe integration with manned conventional traffic in non-segregated airspaces, in accordance with the overall Air Traffic Management (ATM) paradigm, specific enabling technologies are needed. Among the enabling technologies identified as crucial for RPAS integration into the overall ATM system, Detect and Avoid (DAA) technology is fundamental. In the meantime, to support extended surveillance, the universal introduction of cooperative Automatic Dependent Surveillance – Broadcast (ADS-B) on board aircraft is increasingly implemented, with the potential to cover the whole airspace, including remote areas not usually covered by conventional radar surveillance. In this paper, we present and discuss the experimental results obtained through the real-time validation, with hardware and human in the loop (RTS-HIL) simulations, of an automatic ADS-B based Separation Assurance and Collision Avoidance system aimed at supporting RPAS automatic operations as well as remote pilot decision making. After an introductory outline of the system’s Concept of Operations (ConOps) and architectural organization, along with basic information about the main system functionalities, the paper describes the tests that were carried out and discusses the results obtained, in order to highlight the performance and limitations of the proposed system. In particular, we report not only the quantitative performance achieved but also the feedback received from the pilots on how to improve the system, for instance in terms of the preferred type of conflict resolution manoeuvre elaborated by the system.