Preprint Article · Version 2 · Preserved in Portico · This version is not peer-reviewed

Serverless GEO Labels for the Semantic Sensor Web

Version 1 : Received: 21 February 2020 / Approved: 23 February 2020 / Online: 23 February 2020 (13:51:13 CET)
Version 2 : Received: 29 June 2020 / Approved: 30 June 2020 / Online: 30 June 2020 (08:06:39 CEST)

How to cite: Graupner, A.; Nüst, D. Serverless GEO Labels for the Semantic Sensor Web. Preprints 2020, 2020020326. https://doi.org/10.20944/preprints202002.0326.v2

Abstract

As the amount of sensor data made available online increases, it becomes more difficult for users to identify useful datasets. Semantic web technologies improve discovery with meaningful ontologies, but the decision about a dataset's suitability remains with the user. The GEO label provides a visual summary of standardised metadata to aid users in this process. This work presents novel rules for deriving the information for the GEO label's multiple facets, such as user feedback or quality information, from the Semantic Sensor Network Ontology and related ontologies. It enhances an existing implementation of the GEO label API to generate labels for resources of the Semantic Sensor Web, and the prototype is deployed to serverless cloud infrastructures. We find that serverless GEO label generation can handle two evaluation scenarios, concurrent users and burst generation. More real-world semantic sensor descriptions and an integration into large-scale discovery platforms are needed to develop the presented solutions further.
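To make the facet-derivation idea concrete, the following is a minimal sketch of how the availability of GEO label facets might be checked against an SSN/SOSA sensor description using SPARQL ASK queries. The file name, facet names, and property paths are illustrative assumptions for this sketch only; they are not the rules defined in the article or the GEO label API implementation.

```python
import rdflib

# Hypothetical SSN/SOSA sensor description in Turtle; the file name is a
# placeholder, not taken from the article.
graph = rdflib.Graph()
graph.parse("sensor-description.ttl", format="turtle")

# Each GEO label facet is mapped to a SPARQL ASK query over the metadata
# graph; a facet counts as "available" if the query matches. The property
# paths below are assumed examples, not the article's derivation rules.
FACET_QUERIES = {
    "producer_comments": """
        PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
        ASK { ?s rdfs:comment ?comment . }
    """,
    "quality_information": """
        PREFIX ssn-system: <http://www.w3.org/ns/ssn/systems/>
        ASK { ?s ssn-system:hasSystemCapability ?capability . }
    """,
}

def facet_availability(g: rdflib.Graph) -> dict:
    """Return, per facet, whether the metadata graph contains matching statements."""
    return {name: bool(g.query(query).askAnswer)
            for name, query in FACET_QUERIES.items()}

print(facet_availability(graph))
```

A serverless deployment would wrap such a check in a stateless function handler that fetches the sensor description, evaluates the facet queries, and returns the rendered GEO label image.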

Keywords

GEO label; serverless; semantic sensor web; discovery; visualisation; sensor web

Subject

Environmental and Earth Sciences, Remote Sensing

Comments (1)

Comment 1
Received: 30 June 2020
Commenter: Daniel Nüst
Commenter's Conflict of Interests: Author
Comment: Update after first reviewer comments: add discussion section, update figures.