Document Type
Conference Paper
Rights
Available under a Creative Commons Attribution Non-Commercial Share Alike 4.0 International Licence
Disciplines
Computer Sciences, Robotics and automatic control
Abstract
As ubiquitous computer and sensor systems become abundant, the potential for automatic identification and tracking of human behaviours becomes all the more evident. Annotating complex human behaviour datasets to achieve ground truth for supervised training can, however, be extremely labour-intensive and error-prone. One possible solution to this problem is activity discovery: the identification of activities in an unlabelled dataset by means of an unsupervised algorithm. This paper presents a novel approach to activity discovery that utilises deep learning based language production models to construct a hierarchical, tree-like structure over a sequential vector of sensor events. Our approach differs from previous work in that it explicitly aims to deal with interleaving (switching back and forth between activities) in a principled manner, by utilising the long-term memory capabilities of a recurrent neural network cell. We present our approach and test it on a realistic dataset to evaluate its performance. Our results demonstrate the viability of the approach and suggest that it merits further investigation. We believe this is a useful direction to consider in accounting for the continually changing nature of behaviours.
DOI
https://doi.org/10.1007/978-3-030-45778-5_6
Recommended Citation
Rogers, E., Ross, R.J. & Kelleher, J.D. (2020). Language model co-occurrence linking for interleaved activity discovery. In Proceedings of the 34th International ECMS Conference on Modelling and Simulation, ECMS 2020, pp. 183--189. doi:10.1007/978-3-030-45778-5_6
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-Share Alike 4.0 International License.
Publication Details
Appears in Proceedings of the 34th International ECMS Conference on Modelling and Simulation, ECMS 2020, pages 183--189