Article
Preserved in Portico. This version is not peer-reviewed.
A Robotic Context Query-processing Framework based on Spatio-temporal Context Ontology
Version 1: Received: 31 August 2018 / Approved: 31 August 2018 / Online: 31 August 2018 (16:12:54 CEST)
A peer-reviewed article of this Preprint also exists.
Lee, S.; Kim, I. A Robotic Context Query-Processing Framework Based on Spatio-Temporal Context Ontology. Sensors 2018, 18, 3336.
Abstract
Service robots operating in indoor environments should recognize dynamic changes from sensors such as RGB-D cameras and recall past context. Therefore, we propose a context query-processing framework for service robots, comprising a spatio-temporal robotic context query language (ST-RCQL) and a spatio-temporal robotic context query-processing system (ST-RCQP). Both are designed on the basis of a spatio-temporal context ontology. ST-RCQL can query not only current context knowledge but also past context knowledge. In addition, ST-RCQL includes a variety of time operators and time constants, so queries can be written concisely and efficiently. ST-RCQP is a query-processing system equipped with a perception handler, working memory, and a backward reasoner for real-time query processing. Moreover, ST-RCQP accelerates query processing by building a spatio-temporal index in the working memory, where percepts are stored. Through various qualitative and quantitative experiments, we demonstrate the high efficiency and performance of the proposed context query-processing framework.
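The abstract describes a working memory that stores timestamped percepts under a temporal index so that past-context queries avoid scanning all stored facts. The following is a minimal illustrative sketch of that idea, not the authors' implementation: all names (`Percept`, `WorkingMemory`, the example entities) are hypothetical, and the real ST-RCQP system uses an ontology-backed backward reasoner rather than a flat list.

```python
from bisect import bisect_left
from collections import namedtuple

# Hypothetical percept: a timestamped (subject, predicate, object) assertion.
Percept = namedtuple("Percept", "time subject predicate obj")

class WorkingMemory:
    """Toy working memory with a temporal index: percepts are kept
    sorted by timestamp so time-range queries use binary search
    instead of a full scan."""

    def __init__(self):
        self._percepts = []   # percepts sorted by time
        self._times = []      # parallel list of timestamps (the index)

    def add(self, percept):
        i = bisect_left(self._times, percept.time)
        self._times.insert(i, percept.time)
        self._percepts.insert(i, percept)

    def query(self, t_start, t_end, predicate=None):
        """Return percepts observed in [t_start, t_end], optionally
        filtered by predicate -- analogous to a past-context query."""
        lo = bisect_left(self._times, t_start)
        hi = bisect_left(self._times, t_end + 1)
        hits = self._percepts[lo:hi]
        if predicate is not None:
            hits = [p for p in hits if p.predicate == predicate]
        return hits

wm = WorkingMemory()
wm.add(Percept(10, "cup1", "on", "table1"))
wm.add(Percept(25, "cup1", "on", "shelf2"))
wm.add(Percept(40, "robot", "near", "table1"))

# "Where was cup1 before t=30?" -- a query over past context.
past = wm.query(0, 30, predicate="on")
print([(p.time, p.obj) for p in past])  # → [(10, 'table1'), (25, 'shelf2')]
```

The sorted-timestamp list stands in for the paper's spatio-temporal index: both exist to turn a time-bounded query into a narrow lookup rather than an exhaustive match over all stored percepts.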
Keywords
intelligent service robot; robotic context query; context ontology
Subject
Computer Science and Mathematics, Robotics
Copyright: This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.