ARTICLE | doi:10.20944/preprints202305.0525.v1
Subject: Computer Science And Mathematics, Computer Science Keywords: datasets; standards; standardization; guidelines; framework; interoperability.
Online: 8 May 2023 (10:49:34 CEST)
With the recent advances in science and technology, more processing capability and data have become available, allowing a more straightforward implementation of data analysis techniques. Fortunately, the available online data storage capacity follows this trend, and vast amounts of data can be stored online freely or at accessible costs. As happens with every evolution (or revolution) in any field of science, organizing and sharing this data is essential to contribute to new studies or to validate obtained results quickly. To facilitate this, we must guarantee interoperability between existing datasets and the developed software, whether commercial or open source. This article explores this issue, analyzing current initiatives to establish data standards and comparing some of the existing online dataset storage platforms. Through a Strengths, Weaknesses, Opportunities, and Threats (SWOT) analysis, it is possible to better understand the strategy that should be taken to improve efficiency in this field, which depends directly on the data characteristics.
ARTICLE | doi:10.20944/preprints202112.0286.v2
Subject: Engineering, Control And Systems Engineering Keywords: data integration; interoperability; harmonization; GeoBIM; metadata
Online: 7 June 2022 (11:10:07 CEST)
The reuse and integration of data offer great opportunities, supported by the F.A.I.R. data principles. Seamless data integration from heterogeneous sources has long been of interest to the geospatial community. However, 3D city models, BIM and information supporting smart cities present higher semantic and geometrical complexity, which poses new challenges never before tackled in a comprehensive methodology. Building on previous theories and studies, this paper proposes an overarching workflow and framework for multisource (geo)spatial data integration. It starts from the definition of use-case-based requirements for the integrated data and guides the analysis of the integrability of the involved datasets, suggesting actions to harmonise them, through to data merging and validation. It is finally tested and exemplified on a case study. This approach allows the development of consistent, well-documented and inclusive data integration workflows, for the automation of use cases in various geospatial domains and the production of Interoperable and Reusable data.
CONCEPT PAPER | doi:10.20944/preprints202001.0176.v1
Subject: Computer Science And Mathematics, Computer Science Keywords: community cyberinfrastructure; accessibility; reproducibility; interoperability; models
Online: 17 January 2020 (04:28:34 CET)
In an era of rapid global change, our ability to understand and predict Earth's natural systems is lagging behind our ability to monitor and measure changes in the biosphere. Bottlenecks in our ability to process information have reduced our capacity to fully exploit the growing volume and variety of data. Here, we take a critical look at the information infrastructure that connects modeling and measurement efforts, and propose a roadmap that accelerates production of new knowledge. We propose that community cyberinfrastructure tools can help mend the divisions between empirical research and modeling, and accelerate the pace of discovery. A new era of data-model integration requires investment in accessible, scalable, transparent tools that integrate the expertise of the whole community, not just a clique of ‘modelers’. This roadmap focuses on five key opportunities for community tools: the underlying backbone to community cyberinfrastructure; data ingest; calibration of models to data; model-data benchmarking; and data assimilation and ecological forecasting. This community-driven approach is key to meeting the pressing needs of science and society in the 21st century.
ARTICLE | doi:10.20944/preprints202308.0994.v1
Subject: Engineering, Civil Engineering Keywords: BIM methodology; BIM manager; coordination; integration; interoperability
Online: 14 August 2023 (07:11:18 CEST)
Abstract: Building Information Modelling (BIM) methodology has been improving the quality of construction activity in all sectors: multidisciplinary design development; construction planning and monitoring; building management and maintenance. A BIM environment aggregates several disciplines and different professional skillsets, and in order to control and improve the quality of a BIM project, a BIM manager is required. The BIM manager has the responsibility to coordinate all tasks involved in a building design and the associated activities, usually worked out over the project documents. This professional can access the distinct discipline models, located on a shared platform, and request amendments if inconsistencies are detected. The topic of the present study is illustrated with three building cases where distinct projects, disciplines and tasks were elaborated: collaboration between disciplines (architecture, structures and construction); structural analyses and reinforcement details; quantity take-off of materials and cost estimation; construction scheduling and simulation. Although there are still limitations in the implementation of BIM methodology across all sectors and stages of the construction industry, BIM has brought an important improvement in the quality of building design, reflected in the quality of the final product. The present study highlights the BIM manager's role in projects that aggregate several disciplines and experts, making an important contribution to the quality of a building design. BIM methodology is a current demand in the construction industry, supported by advanced technology and adequate project management.
REVIEW | doi:10.20944/preprints202204.0078.v2
Subject: Medicine And Pharmacology, Pathology And Pathobiology Keywords: digitalization; clinical chemistry; artificial intelligence; interoperability; FAIRification
Online: 15 July 2022 (05:00:44 CEST)
Laboratory medicine is a digital science. Every large hospital produces a wealth of data each day, from simple numerical results of, e.g., sodium measurements to the highly complex output of "-omics" analyses, as well as quality control results and metadata. Processing, connecting, storing, and ordering extensive parts of these individual data requires Big Data techniques. Whereas novel technologies such as artificial intelligence and machine learning have exciting applications for the augmentation of laboratory medicine, the Big Data concept remains fundamental for any sophisticated data analysis in large databases. To make laboratory medicine data optimally usable for clinical and research purposes, they need to be FAIR: findable, accessible, interoperable, and reusable. This can be achieved, for example, by automated recording, connection of devices, efficient ETL (Extract, Transform, Load) processes, careful data governance, and modern data security solutions. Enriched with clinical data, laboratory medicine data allow a gain in pathophysiological insights, can improve patient care, or can be used to develop reference intervals for diagnostic purposes. Nevertheless, Big Data in laboratory medicine do not come without challenges: handling the growing number of analyses, and the data derived from them, is a demanding task. Laboratory medicine experts are and will be needed to drive this development, take an active role in the ongoing digitalization, and provide guidance for their clinical colleagues engaging with laboratory data in research.
ARTICLE | doi:10.20944/preprints201808.0320.v1
Subject: Environmental And Earth Sciences, Remote Sensing Keywords: re-usability; patterns; interoperability; geographic information systems
Online: 18 August 2018 (05:38:06 CEST)
Reuse of patterns is a self-evident approach for managing interoperability concerns. Although patterns for resolving interoperability barriers exist in the literature, no study exists on the adoption of interoperability patterns by Geographic Information Systems (GIS) practitioners in industry. Thus, there is limited understanding of pattern re-usability, yet the advantages offered by interoperability patterns provide a reasonably sound justification for their usage. This paper examines the adoption of proven interoperability best practices in the GIS industry. An empirical study involving semi-structured interviews was employed to gather data from GIS developers on domain interoperability best practices. Results indicated that industry and communities of practice have been converging at the technical level to ensure the interoperability of GIS concerns. Semantic interoperability and related patterns are the least understood, yet semantic barriers still exist. This is partly due to the complexity associated with the top-down approach used to develop semantic interoperability solutions. Therefore, this study proposes research into resolving barriers to the adoption of interoperability patterns that reduce complexity while solving semantic interoperability barriers.
ARTICLE | doi:10.20944/preprints201806.0488.v1
Subject: Computer Science And Mathematics, Applied Mathematics Keywords: gis; bim; ifc; citygml; integration; interoperability; geometry
Online: 29 June 2018 (15:15:57 CEST)
It is widely acknowledged that the integration of BIM and GIS data is a crucial step forward for future 3D city modelling, but most of the research conducted so far has covered only the semantic aspects of GIS-BIM integration. We present here the results of the GeoBIM project, in which we tackled three integration problems focussing instead on aspects involving geometry processing: (i) the automated processing of complex architectural IFC models, (ii) the integration of existing GIS subsoil data in BIM, and (iii) the georeferencing of BIM models for their use in GIS software. All the problems have been studied using real world models and existing datasets made and used by practitioners in the Netherlands. For each problem, we expose in detail the issues we faced, our proposed solutions, and our recommendations for a more successful integration.
ARTICLE | doi:10.20944/preprints201805.0075.v1
Subject: Computer Science And Mathematics, Information Systems Keywords: interoperability; IoT; connectivity; Industry 4.0; smart manufacturing
Online: 3 May 2018 (12:25:53 CEST)
This paper presents technical solutions to increase Industry 4.0 maturity. Within the "5G-Enabled Manufacturing" project, a 5G network was deployed on the shop floor to enable fast and scalable connectivity. This network was used to connect a grinding machine to a private cloud to enable visibility and transparency of the production data, which is the basis for Industry 4.0 and smart manufacturing. Results indicate a present lack of commercially available, product-independent solutions for interconnection and data transfer, despite the availability of open standards and well-documented demonstrator projects. The solution is described and discussed with regard to technical interoperability, focusing on the system layout, communication standards, and open systems. From the discussion, it is derived that manufacturing end-users need to expand and further internalize knowledge of future information and communication technologies to reduce their dependency on equipment and technology providers.
REVIEW | doi:10.20944/preprints202311.0011.v1
Subject: Public Health And Healthcare, Public Health And Health Services Keywords: EHR; semantic interoperability; clinical information models; Archetype modeling
Online: 1 November 2023 (09:45:34 CET)
Semantic interoperability is a foundational concept in modern healthcare, enabling seamless communication and meaningful data exchange across diverse EHR systems. At the heart of this concept lie archetype clinical information models, which serve as standardized templates to structure and encode clinical data, ensuring consistency and shared understanding among the various healthcare stakeholders. This review paper delves into the intricacies of Electronic Health Records (EHRs) within the context of semantic interoperability and the pivotal role played by archetype clinical information models. Drawing from a wealth of recent research, it examines the theoretical underpinnings of semantic interoperability and, by surveying the literature, synthesizes the diverse approaches and methods employed to achieve it within EHR systems. Furthermore, this review does not shy away from addressing the significant challenges that persist in this arena, including issues related to data quality, governance, resistance to change, and the scalability and integration hurdles that healthcare organizations encounter in their quest for interoperable EHR systems. Based on a review of relevant research papers from sources including PubMed, ScienceDirect, and Taylor & Francis, covering the period from 2015 to 2023, the paper offers an up-to-date, comprehensive understanding of the approaches and methods employed in this domain. By presenting a well-rounded picture of the field, it offers valuable insights for healthcare stakeholders, researchers, and practitioners, and underscores the enduring importance of semantic interoperability in EHR systems, not only in advancing patient care, clinical decision support, and healthcare operations but also in driving data-driven innovations in the dynamic and ever-evolving healthcare sector. This comprehensive examination sets the stage for future advancements in the pursuit of truly interoperable and data-driven healthcare systems.
ARTICLE | doi:10.20944/preprints202310.0659.v1
Subject: Computer Science And Mathematics, Computer Science Keywords: Digital Twin; Smart City; Interoperability; High-Level Architecture
Online: 11 October 2023 (03:32:37 CEST)
Infrastructure and urban network operators, city users and industrialists face complex issues in ensuring the sustainability of their services and in maintaining, operating and developing their urban systems, while integrating environmental, economic and societal impacts and remaining resilient to unexpected geopolitical and climatic upheavals. Digital Twins are now recognized as the cornerstone of digital transformation. In the field of smart cities, they can enable all stakeholders to collaborate across disciplinary silos and foster digital transformation in urban and territorial projects to ensure sustainability, resilience and increased inventiveness. However, the application of Digital Twin technology to Smart Cities lacks standards. This paper proposes a high-level architecture for Digital Twins as an enabler of various levels of interoperation between the key stakeholders of a Smart City. Such a technology-agnostic architecture, when implemented for a given Smart City, produces a middleware for large-scale interoperability between the actors involved in the multiple supply chains of this territory.
ARTICLE | doi:10.20944/preprints201910.0071.v1
Subject: Computer Science And Mathematics, Computer Science Keywords: Blockchain; Distributed Ledger Technology; Blockchain Trilemma; Scalability; Interoperability
Online: 7 October 2019 (12:24:44 CEST)
With the interest and attention that Blockchain and Distributed Ledger Technologies (DLT) have recently attracted, the technology is advancing at a very high rate. With investors and applications in a wide variety of fields, a lot of funding and effort is being driven into bringing the technology to everyday use. The community and companies are coming up with new ways to collaborate, which makes the blockchain ecosystem evolve at full tilt. Consequently, this paper's aim is to review the academic and grey literature, to provide readers with information about the evolution, benefits and challenges of public Distributed Ledger Technologies, and to discuss the latest solutions being developed to bring decentralization closer to the mainstream. The paper reviews Directed Acyclic Graph (DAG)-structured distributed ledgers with a focus on Hedera Hashgraph, a novel DLT bringing a unique consensus algorithm, with new use cases enabled by new cryptoeconomic mechanisms as well as vital services such as Solidity smart contracts and distributed file storage. Lastly, we explore second-layer network protocols, a major topic for solving scalability issues, empowering Bitcoin and other DLTs, and disrupting cryptocurrency exchanges, focusing on the Lightning Network.
ARTICLE | doi:10.20944/preprints202311.0104.v1
Subject: Public Health And Healthcare, Other Keywords: OMOP; OHDSI; interoperability; data harmonization; clinical data; claims data
Online: 2 November 2023 (07:45:02 CET)
To gain insight into the real-life care of patients in the healthcare system, data from hospital information systems and insurance systems are required. Consequently, linking clinical data with claims data is necessary. To ensure their syntactic and semantic interoperability, the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM) was chosen. However, there is no detailed guide that would allow researchers to follow a consistent process for data harmonization. Thus, the aim of this paper is to conceptualize a generic data harmonization process for the OMOP CDM. For this purpose, we conducted a literature review focusing on publications that address the harmonization of clinical or claims data in the OMOP CDM. Subsequently, the process steps used, and their chronological order, were extracted for each included publication. The results were then compared to derive a generic sequence of process steps. From the 23 included publications, a generic data harmonization process for the OMOP CDM was conceptualized, consisting of nine process steps: dataset specification, data profiling, vocabulary identification, coverage analysis of vocabularies, semantic mapping, structural mapping, the extract-transform-load (ETL) process, and qualitative and quantitative data quality analysis. This process can be used as a step-by-step guide to assist other researchers in harmonizing source data in the OMOP CDM.
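As a hedged illustration, the nine-step sequence above can be sketched as a minimal ordered pipeline. The step names come from the abstract; the data shapes and the pass-through dispatch are placeholders, not the authors' implementation:

```python
# Sketch of the nine-step OMOP CDM harmonization sequence as an ordered
# pipeline. Step names are taken from the abstract; the dispatch below is
# a placeholder, not a real ETL implementation.

OMOP_HARMONIZATION_STEPS = [
    "dataset specification",
    "data profiling",
    "vocabulary identification",
    "coverage analysis of vocabularies",
    "semantic mapping",
    "structural mapping",
    "extract-transform-load process",
    "qualitative data quality analysis",
    "quantitative data quality analysis",
]

def harmonize(source_records):
    """Thread the source dataset through each process step in order."""
    state = {"records": source_records, "completed": []}
    for step in OMOP_HARMONIZATION_STEPS:
        # A real pipeline would call a dedicated implementation per step,
        # e.g. mapping source codes to OMOP standard concepts.
        state["completed"].append(step)
    return state

result = harmonize([{"source_code": "E11.9", "vocabulary": "ICD10CM"}])
```

The point of the sketch is the fixed ordering: each step consumes the state left by the previous one, which is why the paper derives a generic chronological sequence rather than an unordered checklist.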
ARTICLE | doi:10.20944/preprints202202.0109.v1
Subject: Computer Science And Mathematics, Information Systems Keywords: Industry 4.0; Industry 5.0; interoperability; Machine Learning; AI; HR; Attrition
Online: 8 February 2022 (12:31:01 CET)
This paper aims to raise awareness of certain interoperability issues as we shape Industry 5.0 to enable a human-centric, resilient society. We advocate that the need to share small and specific datasets will intensify as AI-based solutions become more pervasive. Consequently, dataspaces should be carefully designed to address this need. We advance the conversation by presenting a case study from HR, demonstrating how to predict the possibility of an employee experiencing attrition. Our experimental results show that more than 500 samples are needed to develop a machine learning model capable of sufficiently generalizing the problem; they thereby show the feasibility of the idea. However, in small and medium-sized companies this approach cannot be implemented due to the limited number of samples. At the same time, we advocate that this obstacle may be overcome if multiple companies join a shared dataspace, which in turn raises interoperability issues.
REVIEW | doi:10.20944/preprints202007.0123.v1
Subject: Computer Science And Mathematics, Mathematical And Computational Biology Keywords: causal interactions; databases; interoperability; biological pathway; logical modeling; computational biology
Online: 7 July 2020 (09:50:40 CEST)
Causal molecular interactions represent key building blocks used in computational modeling, where they facilitate the assembly of regulatory networks. These regulatory networks can then be used to predict biological and cellular behavior through system perturbations and in silico simulations. Today, broad sets of these interactions are being made available in a variety of biological knowledge resources. Moreover, different visions, based on distinct biological interests, have led to the development of multiple ways to describe and annotate causal molecular interactions. Therefore, data users can find it challenging to efficiently explore causal interaction resources and to be aware of the recorded contextual information that ensures valid use of the data. This manuscript presents a review of public resources collecting causal interactions and the different views they convey, together with a thorough description of the export formats established to store and retrieve these interactions. Our goal is to raise awareness amongst the targeted audience, i.e., logical modelers, but also any scientist interested in molecular causal interactions, about existing data resources and how to get familiar with them.
ARTICLE | doi:10.20944/preprints202001.0170.v1
Subject: Engineering, Control And Systems Engineering Keywords: internet of things; smart grids; protocol communication; interoperability; CoAP; OSGP
Online: 16 January 2020 (11:41:25 CET)
The evolution and miniaturization of technologies for processing, storage, and communication have enabled computer systems to process a high volume of information and make decisions without human intervention. Within this context, several system architectures and models have gained prominence, such as the Internet of Things (IoT) and Smart Grids (SGs). SGs use communication protocols to exchange information, among which the Open Smart Grid Protocol (OSGP) stands out. However, this protocol lacks integration support for IoT systems that use already-consolidated communication protocols such as the Constrained Application Protocol (CoAP). Thus, this work develops an integration of the OSGP and CoAP protocols to allow communication between conventional IoT systems and systems dedicated to SGs. Results demonstrate the effectiveness of this integration, with minimal impact on the flow of commands and data, enabling the use of the developed CoAP-OSGP Interface for the Internet of Things (COIIoT).
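As a rough sketch of what such a protocol translation can look like, the snippet below maps CoAP request methods onto OSGP-style table read/write operations. The class names, table ids, and URI scheme are illustrative assumptions, not the actual COIIoT design described in the paper; only the CoAP response codes (2.05, 2.04, 4.05) are standard:

```python
# Minimal sketch of a CoAP-to-OSGP translation layer. Class names, table
# ids, and the URI scheme are illustrative assumptions, not the actual
# COIIoT interface; CoAP response codes follow RFC 7252.

class OsgpDevice:
    """Stand-in for an OSGP endpoint exposing numbered data tables."""
    def __init__(self):
        self.tables = {21: b"\x00\x10"}  # hypothetical meter register

    def read_table(self, table_id):
        return self.tables[table_id]

    def write_table(self, table_id, payload):
        self.tables[table_id] = payload


class CoapOsgpGateway:
    """Translate CoAP GET/PUT on /tables/<id> into OSGP table operations."""
    def __init__(self, device):
        self.device = device

    def handle(self, method, uri_path, payload=None):
        table_id = int(uri_path.rsplit("/", 1)[-1])
        if method == "GET":
            # CoAP 2.05 Content carries the table contents back to the client
            return ("2.05 Content", self.device.read_table(table_id))
        if method == "PUT":
            self.device.write_table(table_id, payload)
            return ("2.04 Changed", b"")
        return ("4.05 Method Not Allowed", b"")


gateway = CoapOsgpGateway(OsgpDevice())
code, body = gateway.handle("GET", "/tables/21")
```

The design choice this illustrates is that the gateway keeps both sides unmodified: IoT clients speak plain CoAP, while the device sees only native OSGP-style table operations.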
REVIEW | doi:10.20944/preprints202304.0426.v2
Subject: Computer Science And Mathematics, Computer Science Keywords: Internet of Medical Things (IoMT); data exchange; healthcare; medical data; interoperability
Online: 5 June 2023 (08:12:36 CEST)
A medical entity (hospital, nursing home, rest home, revalidation center, etc.) usually includes a multitude of information systems that allow for quick decision-making close to the medical sensors. The Internet of Medical Things (IoMT) is an area of IoT that generates a lot of data of different natures (radio, CT scan, medical reports, medical sensor data). However, these systems need to share and exchange medical information in a seamless, timely, and efficient manner with systems located either within the same entity or in other healthcare entities. The lack of inter- and intra-entity interoperability causes major problems in the analysis of patient records and leads to additional financial costs (e.g., repeated examinations). We therefore take stock of the state of knowledge in order to develop a medical data interoperability architecture model that will allow providers and the different actors in the medical community to exchange patient summary information with other caregivers and partners, improving the quality of care, the level of data security, and the efficiency of care.
ARTICLE | doi:10.20944/preprints202305.0145.v1
Subject: Computer Science And Mathematics, Computer Science Keywords: Semantic interoperability; clinical terminology; SNOMED CT; biomedical ontology; postcoordination; expression templates
Online: 3 May 2023 (11:27:53 CEST)
Expressive clinical terminologies are of utmost importance for achieving semantically interoperable data exchange and reuse in healthcare. SNOMED CT, widely regarded as the most comprehensive terminology in medicine, provides formal concept definitions based on description logic, which not only allow for advanced querying of SNOMED CT-coded data but also for flexibly augmenting its 350,000 concepts. This ability for postcoordination greatly increases the expressivity of the terminology but comes with intrinsic complexity. Compounded by the current lack of tooling support, postcoordination is widely either ignored or applied in an error-prone way. To help facilitate the adoption of postcoordination, we implemented a web application that guides users through the creation of postcoordinated expressions (PCEs) while ensuring adherence to syntactic and semantic constraints. Our approach was greatly facilitated by making use of the extensive SNOMED CT specifications as well as advanced HL7 FHIR Terminology Services. Qualitative evaluations confirmed the usability of the developed application and the correctness of the PCEs created with it.
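To give a concrete sense of what a postcoordinated expression looks like, here is a minimal, hypothetical helper that assembles SNOMED CT Compositional Grammar (SCG) strings. The helper is our own sketch, not the paper's web application, and it enforces no syntactic or semantic constraints; the concept identifiers used are standard SNOMED CT codes:

```python
# Hypothetical helper for assembling SNOMED CT postcoordinated expressions
# in Compositional Grammar (SCG). This is a sketch, not the paper's tool:
# it performs no validation against the SNOMED CT concept model.

def pce(focus_concept, refinements):
    """Build an SCG expression: 'focus : attribute = value, ...'."""
    parts = [f"{attr} = {value}" for attr, value in refinements.items()]
    return f"{focus_concept} : " + ", ".join(parts)

# Refine "Fracture of bone" with a finding site of the femur
expr = pce(
    "125605004 |Fracture of bone|",
    {"363698007 |Finding site|": "71341001 |Bone structure of femur|"},
)
print(expr)
# 125605004 |Fracture of bone| : 363698007 |Finding site| = 71341001 |Bone structure of femur|
```

What the paper's application adds beyond such string assembly is precisely the hard part: checking that each attribute-value refinement is permitted by SNOMED CT's concept model, which is where unguided postcoordination becomes error-prone.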
ARTICLE | doi:10.20944/preprints202205.0344.v1
Subject: Computer Science And Mathematics, Information Systems Keywords: Linked (open) Data; Semantic Interoperability; Data Mapping; Governmental Data; SPARQL; Ontologies
Online: 25 May 2022 (08:18:46 CEST)
In this paper, we present a method to map information regarding service activity provision residing in governmental portals across the European Commission. To perform this, we used as a basis the enriched Greek e-GIF ontology, modeling concepts and relations in one of the two data portals examined (i.e., Points of Single Contact), since the relevant information was not provided on the second. Mapping consisted of transforming the information appearing in governmental portals into RDF format (i.e., as Linked Data), so that it can be easily exchanged. Mapping proved a tedious task, since a description of how information is modeled in the second Point of Single Contact is not provided and had to be extracted manually.
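A much-simplified sketch of such a mapping step is shown below: one service-provision record is turned into RDF triples and serialized as N-Triples. The namespace and property names are placeholders, not the actual terms of the enriched Greek e-GIF ontology, and the record itself is invented for illustration:

```python
# Much-simplified sketch of mapping one service-provision record to RDF,
# serialized as N-Triples. The namespace and property names below are
# placeholders, not the actual enriched Greek e-GIF ontology terms.

EGIF = "http://example.org/egif#"  # placeholder namespace

def record_to_triples(record):
    """Turn a flat portal record into (subject, predicate, object) triples."""
    subject = f"<{EGIF}service/{record['id']}>"
    return [
        (subject, f"<{EGIF}hasTitle>", f'"{record["title"]}"'),
        (subject, f"<{EGIF}providedBy>", f'"{record["authority"]}"'),
    ]

def to_ntriples(triples):
    """Serialize triples one per line, terminated by ' .' (N-Triples)."""
    return "\n".join(f"{s} {p} {o} ." for s, p, o in triples)

record = {"id": "1042", "title": "Business licence issuance",
          "authority": "Ministry of Development"}
print(to_ntriples(record_to_triples(record)))
```

Once portal content is in this form, it can be loaded into any triple store and queried with SPARQL, which is what makes the mapped information easily exchangeable between portals.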
ARTICLE | doi:10.20944/preprints202105.0377.v1
Subject: Computer Science And Mathematics, Mathematical And Computational Biology Keywords: sensor data; wireless body area network; wearable devices; sensor data interoperability
Online: 17 May 2021 (09:47:26 CEST)
The monitoring of maternal and child health, using wearable devices made with wireless sensor technologies, is expected to reduce maternal and child death rates. Wireless sensor technologies have been used in wireless sensor networks to enable the acquisition of data for monitoring machines, smart cities, transportation, asset tracking, and tracking of human activity. Applications based on wireless body area network (WBAN) have been used in healthcare for measuring and monitoring of patient health and activity through integration with wearable devices. Wireless sensors used in WBAN can be cost-effective, enable remote availability, and can be integrated with electronic health record (EHR) management systems. Interoperability of WBAN sensor data with other linked data has the potential to improve health for all, including maternal and child health through the improvement of data access, data quality and healthcare access. This paper presents a survey of the state-of-the-art techniques for managing WBAN sensor data interoperability. The findings in this study will provide reliable support to enable policymakers and health care providers to take action to enhance the use of e-health to improve maternal-child health and reduce the mortality rates of women and children.
Subject: Engineering, Control And Systems Engineering Keywords: Internet of Things (IoT); Challenges; Test Strategies; Quality Assurance; Suggestions; Interoperability; Security
Online: 10 December 2019 (16:15:13 CET)
Immense challenges arise in the quality assurance area due to contemporary developments in Internet of Things (IoT) technology. Current issues mainly relate to test coverage, test diversity, IoT stability, the use of cellular networks in IoT, IoT device updates, security, data integration, and interoperability. In this paper, we present these issues together with suggestions for tackling them.
ARTICLE | doi:10.20944/preprints201805.0031.v1
Subject: Computer Science And Mathematics, Information Systems Keywords: SensorThings API; INSPIRE; download services; spatio-temporal data interoperability; Internet of Things
Online: 2 May 2018 (12:06:28 CEST)
ARTICLE | doi:10.20944/preprints202007.0243.v1
Subject: Engineering, Control And Systems Engineering Keywords: georeferencing; conversions; interoperability; CityGML; Industry Foundation Classes; Building Information Models; 3D city models; standards
Online: 11 July 2020 (16:26:37 CEST)
The integration of 3D city models with Building Information Models (BIM), abbreviated as GeoBIM, facilitates improved data support for several applications, e.g. 3D map updates, building permit issuing, detailed city analysis, infrastructure design, and context-based building design, to name a few. To achieve this integration, several issues need to be tackled and solved, e.g. harmonization of features, interoperability, format conversions, and integration of procedures. The GeoBIM benchmark 2019, funded by ISPRS and EuroSDR, evaluated the state of implementation of tools addressing some of those issues. In particular, in the part of the benchmark described in this paper, the application of georeferencing to Industry Foundation Classes (IFC) models and consistent conversions between 3D city models and BIM are investigated, considering OGC CityGML and buildingSMART IFC as reference standards. In the benchmark, sample datasets in the two reference standards were provided. External volunteers were asked to describe and test georeferencing procedures for IFC models and conversion tools between CityGML and IFC. From the analysis of the delivered answers and processed datasets, it was apparent that, while tools and procedures to support georeferencing and data conversion are available, a comprehensive definition of the requirements, clear rules to perform these two tasks, and solid technological solutions implementing them are still lacking. These specific issues can be a sensible starting point for planning the next GeoBIM integration agendas.
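The mathematical core of georeferencing a BIM model can be illustrated by the usual local-to-map transformation: rotate the local engineering coordinates toward true north, then translate by the map offset (easting, northing). The function below is a generic sketch of that operation; the parameter values are made up for illustration and do not come from the benchmark datasets:

```python
# Sketch of the core local-to-map transformation behind georeferencing a
# BIM model: rotate local coordinates by the angle to true north, then
# translate by the map offset. Values below are invented for illustration.
import math

def local_to_map(x, y, eastings, northings, rotation_deg):
    """Rotate the local point about the origin, then translate to map space."""
    a = math.radians(rotation_deg)
    easting = eastings + x * math.cos(a) - y * math.sin(a)
    northing = northings + x * math.sin(a) + y * math.cos(a)
    return easting, northing

# With a 90 degree rotation, the local +x axis maps onto map north
e, n = local_to_map(10.0, 0.0, 85000.0, 446000.0, 90.0)
```

Storing the offset and rotation as parameters, rather than baking map coordinates into the model geometry, is what keeps the BIM model in a well-conditioned local system while still being placeable in GIS software.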
REVIEW | doi:10.20944/preprints202304.0957.v1
Subject: Public Health And Healthcare, Health Policy And Services Keywords: electronic health record; European Union; health care data; health data space; review; semantic interoperability; standardization
Online: 26 April 2023 (04:41:03 CEST)
Semantic interoperability facilitates the exchange of, and access to, health data documented in EHRs with various semantic features. In the EU, the proposed European Health Data Space (EHDS) regulation requires the development of semantic interoperability. To achieve a fully integrated EHDS ecosystem leveraging the value of health data, stakeholders need to overcome the challenges of implementing common standards and other semantic interoperability features. We aimed to establish what scientific evidence is available on developing semantic interoperability. Our research questions focused specifically on the key features of and approaches to semantic interoperability, and on the possible benefits of these choices. For that purpose, we performed a systematic literature review, defining our study framework on the basis of previous research. Our results consisted of 10 studies in which data models, ontologies, terminologies, classifications, and standards were applied to build interoperability. Through increased access to interoperable patient information, better quality and outcomes of care can be achieved. Better communication based on easily accessible data is facilitated between health professionals, and between clinicians and patients. When heading towards the semantic harmonization outlined in the EHDS proposal, more experience and analysis are needed to assess how applicable the chosen solutions are for the semantic interoperability of European health care data.
ARTICLE | doi:10.20944/preprints202106.0510.v1
Subject: Engineering, Automotive Engineering Keywords: Smart Energy Grids; Smart Substation Automation; Process Bus; Interoperability Standards; Industrial Communication Networks; TCP/IP
Online: 21 June 2021 (12:27:47 CEST)
Upgrading energy infrastructures to smart grids necessarily requires integrated technological solutions that ensure the interoperability of their business capabilities and reduce the risk of devaluation of the systems used. The heterogeneity of the infrastructures and the dynamics of their operating environment demand a continuous reduction of complexity, faster execution of processes, and the easy addition of innovative counterparts. The integrated management of the overall ecosystem also demands end-to-end interconnection, quality assurance, the definition of strict security policies, collaborative integration, and the correlation of events. In this respect, every design detail can be critical to the success or failure of a costly and ambitious project such as a smart energy network. This work presents communication operating standards specific to smart electricity network applications, which should be taken into account when planning and implementing new infrastructures.
ARTICLE | doi:10.20944/preprints201901.0302.v1
Subject: Environmental And Earth Sciences, Remote Sensing Keywords: interoperability; digital elevation model; Google Sketchup; geographical information systems-science; free and open source software
Online: 30 January 2019 (05:28:53 CET)
Data creation is often the only way for researchers to produce basic geospatial information for more complex tasks and procedures, such as those that lead to new data for studies of river basins, slope morphodynamics, applied geomorphology and geology, urban and territorial planning, and detailed studies in architecture and civil engineering, among others. This exercise results from a reflection on how specific data processing tasks executed in Google Sketchup (Pro version, 2018) can be used in a context of interoperability with Geographical Information Systems (GIS) software. The focus is the production of contour lines and Digital Elevation Models (DEM) using an innovative sequence of tasks and procedures in both environments (GS and GIS). It starts in the Google Sketchup (GS) graphic interface with the selection of a satellite image of the study area, which can be anywhere on Earth's surface; subsequent processing steps lead to the production of elevation data at the selected scale and equidistance. This new data must be exported to GIS software in vector formats such as Autodesk Design Web Format (DWG) or Autodesk Drawing Exchange Format (DXF). In this essay, open-source GIS software (gvSIG and QGIS) was chosen. Correcting the original SHP by removing the "data noise" that resulted from the DXF file conversion allows the author to create new, clean vector data in SHP format and, at a later stage, generate DEM data. This means that new elevation data becomes available through simple, intuitive, and interoperable procedures and techniques, which configures a costless workflow.
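The clean-then-grid step this abstract describes can be sketched in a few lines of NumPy. This is an illustrative stand-in only: the contour vertices are synthetic, and the inverse-distance-weighted gridding is an assumption for demonstration, not the actual gvSIG/QGIS procedure used in the essay.

```python
import numpy as np

# Hypothetical contour vertices (x, y) with elevations z, standing in for
# points exported from DXF; real data would come from the cleaned SHP layer.
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(0, 100, 200), rng.uniform(0, 100, 200)])
z = 50 + 0.3 * pts[:, 0] - 0.2 * pts[:, 1]  # a simple sloping surface

# "Data noise" removal: drop non-finite or implausible elevations,
# mimicking the cleanup applied after the DXF-to-SHP conversion.
mask = np.isfinite(z) & (z > 0) & (z < 9000)
pts, z = pts[mask], z[mask]

def idw(grid_xy, pts, z, power=2.0, eps=1e-9):
    """Inverse-distance-weighted elevation for each grid cell."""
    d = np.linalg.norm(grid_xy[:, None, :] - pts[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)
    return (w @ z) / w.sum(axis=1)

# Interpolate a regular-grid DEM from the cleaned vertices.
gx, gy = np.meshgrid(np.linspace(0, 100, 40), np.linspace(0, 100, 40))
grid_xy = np.column_stack([gx.ravel(), gy.ravel()])
dem = idw(grid_xy, pts, z).reshape(gx.shape)
print(dem.shape)
```

Because IDW output is a convex combination of the input elevations, the resulting DEM is always bounded by the minimum and maximum of the cleaned data, a useful sanity check after any format conversion.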
REVIEW | doi:10.20944/preprints202207.0022.v1
Subject: Engineering, Electrical And Electronic Engineering Keywords: blockchain; Edge/Fog computing; IIoT architectures; Industry 4.0; interoperability; low latency; reliability; scalability; security; Software-Defined Networking
Online: 1 July 2022 (17:11:41 CEST)
The Industrial Internet of Things (IIoT) is driving an evolution in the remote monitoring, intelligent analytics, and control of industrial processes. A reference architecture provides the general layout information for the flexible integration of IIoT systems; however, as the industrial world is still at an early stage of adopting full-stack IIoT development solutions, some challenges need to be addressed. To cope with these rising challenges and provide blueprint guidelines for developing and implementing IIoT in real time, researchers around the globe have proposed IIoT architectures based on different architectural layers and emerging technologies. In this paper, we first review and compare some widely accepted IIoT reference architectures and present a state-of-the-art review of conceptual and experimental IIoT architectures in the literature. We highlight scalability, interoperability, security, privacy, reliability, and low latency as the main IIoT architectural requirements and compare how the current architectures address these challenges. We also highlight the role of emerging technologies in current IIoT architectures in addressing these requirements, and we identify gaps in the literature for future research to address.
REVIEW | doi:10.20944/preprints202309.1680.v1
Subject: Engineering, Architecture, Building And Construction Keywords: automated structural design; Building Information Modeling (BIM); design automation; generative design; interoperability; Structural Design Optimization (SDO); systematic framework
Online: 25 September 2023 (11:25:42 CEST)
Structural design optimization (SDO) plays a pivotal role in enhancing various aspects of construction projects, including design quality, cost-efficiency, safety, and structural reliability. Recent endeavors in academia and industry have sought to harness the potential of Building Information Modeling (BIM) and optimization algorithms to streamline SDO and improve design outcomes. This review paper synthesizes these efforts, shedding light on how SDO contributes to project coordination. Furthermore, the integration of sustainability considerations and the application of innovative technologies and optimization algorithms in SDO necessitate more interactive early-stage collaboration among project stakeholders. This study offers a comprehensive exploration of contemporary research on integrated SDO employing BIM and optimization algorithms. It commences with an exploratory investigation, employing both qualitative and quantitative analysis techniques following the PRISMA systematic review methodology. Subsequently, an open-ended opinion survey was conducted among construction industry professionals in Europe. This survey yields valuable insights into the coordination challenges, and potential solutions, arising from the technological shifts and interoperability concerns associated with widespread SDO implementation. These preliminary steps of systematic review and industry survey furnish a robust knowledge foundation, enabling the proposal of an intelligent framework for automating early-stage sustainable structural design optimization (ESSDO) within the construction sector. The ESSDO framework addresses the challenges of fragmented collaboration between architects and structural engineers. It integrates seamlessly with the architects' BIM platform, i.e., Autodesk Revit, extracts crucial architectural data, and transfers it to the structural engineers' design and analysis platform, i.e., Autodesk Robot Structural Analysis (RSA), via the visual programming tool Dynamo. Once the optimization occurs, the optimal outcomes are visualized within BIM environments. This visualization elevates interactive collaboration between architects and engineers, facilitating automation throughout the workflow and smoother information exchange.
ARTICLE | doi:10.20944/preprints202206.0120.v1
Subject: Environmental And Earth Sciences, Environmental Science Keywords: Miscanthus; remote sensing; UAV; multispectral images; high-throughput phenotyping; machine learning; yield prediction; trait estimation; PROSAIL; multi-sensor interoperability
Online: 8 June 2022 (09:44:59 CEST)
Miscanthus holds great potential in the frame of the bioeconomy, and yield prediction can help improve the Miscanthus logistic supply chain. Breeding programs in several countries are attempting to produce high-yielding Miscanthus hybrids better adapted to different climates and end-uses. Multispectral images acquired from unmanned aerial vehicles (UAVs) in Italy and in the UK in 2021 and 2022 were used to investigate the feasibility of high-throughput phenotyping (HTP) of novel Miscanthus hybrids for yield prediction and crop trait estimation. An intercalibration procedure was performed using simulated data from the PROSAIL model to link vegetation indices (VIs) derived from two different multispectral sensors. A random forest algorithm estimated yield traits (light interception, plant height, green leaf biomass, and standing biomass) with good accuracy using VI time series, and predicted yield with an RMSE of 2.3 Mg DM ha-1 using a peak descriptor derived from the VI time series. The study demonstrates the potential of UAV multispectral imagery for HTP applications and for yield prediction, providing important information needed to increase sustainable biomass production.
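As a rough illustration of the modeling step, a random forest regressor can map VI-derived descriptors to yield. The feature names, value ranges, and synthetic yield relationship below are assumptions made for demonstration; they are not the study's data or its exact model configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: per-plot descriptors derived from a vegetation
# index (VI) time series (names and ranges are illustrative assumptions).
rng = np.random.default_rng(42)
n = 300
vi_peak = rng.uniform(0.3, 0.9, n)       # e.g. seasonal VI peak value
vi_auc = rng.uniform(5.0, 20.0, n)       # area under the VI curve
vi_slope = rng.uniform(-0.05, 0.05, n)   # late-season senescence slope
X = np.column_stack([vi_peak, vi_auc, vi_slope])
# Hypothetical yield (Mg DM/ha) loosely driven by the descriptors + noise.
y = 5 + 15 * vi_peak + 0.3 * vi_auc + rng.normal(0, 1.5, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
print(f"RMSE: {rmse:.2f} Mg DM/ha")
```

A held-out RMSE like this is the same style of accuracy metric the abstract reports (2.3 Mg DM ha-1), though the value here reflects only the synthetic noise level, not real Miscanthus plots.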
ARTICLE | doi:10.20944/preprints202311.1613.v1
Subject: Computer Science And Mathematics, Other Keywords: Smart Data Models; Remote sensing; Satellite Imagery; Flood Monitoring and Mapping; Flood Risk Assessment; Data Sharing; Interoperability; Water Data Management
Online: 24 November 2023 (15:08:26 CET)
The increasing adoption of innovative technological achievements, along with the penetration of Next Generation Internet (NGI) technologies and Artificial Intelligence (AI) in the water sector, is leading to a shift towards a Water-Smart Society. New challenges have emerged in terms of data interoperability, sharing, and trustworthiness due to the rapidly increasing volume of heterogeneous data generated by multiple technologies. Hence, there is a need for efficient harmonisation and smart modeling of the data to foster advanced AI analytical processes, which will lead to efficient water data management. The main objective of this work is to propose two Smart Data Models focused on the modeling of Satellite Imagery data and Flood Risk Assessment processes. The utilisation of these models reinforces the fusion and homogenisation of diverse information and data, facilitating the adoption of AI technologies for flood mapping and monitoring. Furthermore, a holistic framework has been developed and evaluated via qualitative and quantitative performance indicators, revealing the efficacy of the proposed models in real cases. The framework is based on well-known technologies compatible with the NGSI-LD standards, which are customised and easily applicable to support water data management processes effectively.
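A Smart Data Model payload of this kind is typically serialized as an NGSI-LD entity. The sketch below is illustrative only: the entity type and attribute names ("FloodRiskAssessment", "riskLevel", "derivedFrom") are assumptions for demonstration, not the paper's published model definitions; only the Property/GeoProperty/Relationship structure and the `@context` mechanism follow the NGSI-LD standard.

```python
import json

# Minimal, illustrative NGSI-LD entity for a flood-risk observation.
entity = {
    "id": "urn:ngsi-ld:FloodRiskAssessment:basin-042",
    "type": "FloodRiskAssessment",
    # Scalar attributes are wrapped as NGSI-LD Properties.
    "riskLevel": {"type": "Property", "value": "high"},
    # Geometry is carried as a GeoProperty with a GeoJSON value.
    "observedArea": {
        "type": "GeoProperty",
        "value": {
            "type": "Polygon",
            "coordinates": [[[23.7, 37.9], [23.8, 37.9],
                             [23.8, 38.0], [23.7, 37.9]]],
        },
    },
    # Links to other entities (e.g. the source imagery) are Relationships.
    "derivedFrom": {
        "type": "Relationship",
        "object": "urn:ngsi-ld:SatelliteImagery:S2-tile-34SFJ",
    },
    "@context": [
        "https://uri.etsi.org/ngsi-ld/v1/ngsi-ld-core-context-v1.6.jsonld"
    ],
}
payload = json.dumps(entity, indent=2)
print(payload[:60])
```

Sharing such entities through a common context is what lets heterogeneous producers (satellite pipelines, sensors, risk models) remain interoperable at the data level.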
REVIEW | doi:10.20944/preprints202311.0244.v1
Subject: Engineering, Architecture, Building And Construction Keywords: Bi-directional Interoperability; Building Information Modelling (BIM); Construction 4.0; Digital Transformation; Digital Twin (DT); DT Advancements; DT Technologies; Holistic Review
Online: 3 November 2023 (11:04:37 CET)
Construction 4.0 is witnessing exponential growth in Digital Twin (DT) technology developments and applications, revolutionizing the adoption of Building Information Modelling (BIM) and other emerging technologies used throughout the lifecycle of the built environment. BIM provides technologies, procedures, and data schemas representing building components and systems, while DT enhances this with real-time data for cyber-physical integration, enabling live asset monitoring and better decision-making. Despite being in the early stages of development, DT applications have progressed rapidly in the AEC sector, resulting in a diverse literature landscape owing to the various technologies and parameters involved in fully developing DT technology. The intricate complexities inherent in DT advancements have confused professionals and researchers; this confusion arises from the nuanced distinctions between the two technologies, i.e., BIM and DT, causing a convergence that hinders the realization of their potential. To address this confusion and support the swift development of DT technology, this study presents a holistic review of the existing research, focusing on the critical components responsible for developing DT applications in the construction industry. The study identifies five crucial elements: technologies, maturity levels, data layers, enablers, and functionalities. Additionally, it identifies research gaps and proposes future avenues for streamlined DT development and application in the AEC sector. Future researchers and practitioners can target data integrity, integration and transmission, bi-directional interoperability, non-technical factors, and data security to achieve mature digital twin applications for AEC practices. This study highlights the growing significance of DTs in construction and provides a foundation for further advancements in the field, harnessing its potential to transform built environment practices.
REVIEW | doi:10.20944/preprints202305.0105.v1
Subject: Computer Science And Mathematics, Artificial Intelligence And Machine Learning Keywords: Review; Human action recognition; Smart living; Multimodality; Real-time processing; Interoperability; Resource-constrained processing; Sensing technology; Machine learning; Deep learning; Signal processing; Smart home; Smart environment; Smart city; Smart Community; Ambient Assisted Living
Online: 3 May 2023 (06:54:40 CEST)
Smart living, a concept that has gained increasing attention in recent years, revolves around integrating advanced technologies in homes and cities to enhance the quality of life for citizens. Sensing and human action recognition are crucial aspects of this concept. Smart living applications span various domains, such as energy consumption, healthcare, transportation, and education, which greatly benefit from effective human action recognition. This field, originating from computer vision, seeks to recognize human actions and activities using not only visual data but also many other sensor modalities. This paper comprehensively reviews the literature on human action recognition in smart living environments, synthesizing the main contributions, challenges, and future research directions. This review selects five key domains: Sensing Technology, Multimodality, Real-time Processing, Interoperability, and Resource-Constrained Processing, as they encompass the critical aspects required for successfully deploying human action recognition in smart living. These domains highlight the essential role that sensing and human action recognition play in successfully developing and implementing smart living solutions. This paper serves as a valuable resource for researchers and practitioners seeking to explore further and advance the field of human action recognition in smart living.