Querying the IoT Using Multi-Resolution Contexts
C. Diamantini; D. Potena; E. Storti; D. Ursino
2021-01-01
Abstract
People’s daily lives are increasingly intertwined with smart devices, which are ever more often used in dynamic contexts. Searching and exploiting the wealth of information produced by the Internet of Things (IoT) therefore requires novel models that include a representation of the actual context of use. Defining context is inherently difficult, owing to the variety of application scenarios and user needs. In this paper, we propose a general model for device contexts that represents context components at different resolutions (or levels of granularity). This enables the definition of a multi-resolution, context-based algorithm for querying the IoT according to given preferences and contexts, which can be tightened or relaxed depending on the application goal. Experimental results show that the proposed approach outperforms traditional solutions by increasing the retrieval of relevant results while keeping precision under control.
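To convey the general idea described in the abstract, the following is a minimal illustrative sketch (not the paper’s actual model or algorithm) of a context component organized along a granularity hierarchy, with query matching that is relaxed to a coarser resolution when too few devices satisfy the context constraint. All names (the location hierarchy, `Context`, `Device`, `query_devices`) are hypothetical and chosen only for illustration.

```python
# Hypothetical sketch of multi-resolution context matching with relaxation.
from dataclasses import dataclass

# A toy location hierarchy: finer resolution levels first.
HIERARCHY = ["room", "floor", "building", "city"]

@dataclass
class Context:
    # One value per resolution level, e.g. {"room": "R12", "floor": "2", ...}
    location: dict

@dataclass
class Device:
    device_id: str
    context: Context

def matches(device: Device, target: Context, level: str) -> bool:
    """A device matches if its location agrees with the target at the given level."""
    return device.context.location.get(level) == target.location.get(level)

def query_devices(devices, target: Context, level: str, min_results: int = 1):
    """Return devices matching the target context; relax (coarsen) the
    resolution level until enough results are found or no coarser level exists."""
    idx = HIERARCHY.index(level)
    while idx < len(HIERARCHY):
        hits = [d for d in devices if matches(d, target, HIERARCHY[idx])]
        if len(hits) >= min_results:
            return hits, HIERARCHY[idx]
        idx += 1  # relax the context constraint to a coarser granularity
    return [], None

if __name__ == "__main__":
    target = Context(location={"room": "R12", "floor": "2",
                               "building": "B1", "city": "Ancona"})
    devices = [
        Device("thermo-1", Context({"room": "R07", "floor": "2",
                                    "building": "B1", "city": "Ancona"})),
        Device("cam-3", Context({"room": "R12", "floor": "2",
                                 "building": "B1", "city": "Ancona"})),
    ]
    hits, used_level = query_devices(devices, target, "room", min_results=2)
    print(used_level, [d.device_id for d in hits])  # relaxed from "room" to "floor"
```

In this toy example, requiring at least two results forces the query to relax from the "room" level to the coarser "floor" level, mirroring the abstract’s notion of tightening or relaxing contexts according to the application goal; the paper’s own model and querying algorithm may differ substantially.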