Please use this identifier to cite or link to this item:
http://ir.futminna.edu.ng:8080/jspui/handle/123456789/3096
Title: ClusterNN: A Hybrid Classification Approach to Mobile Activity Recognition
Authors: Bashir, Sulaimon; Doolan, Daniel
Keywords: Activity Recognition, KNN, Smartphones, ClusterNN
Issue Date: Dec-2015
Citation: Bashir, S., Doolan, D., & Petrovski, A. (2015, December). ClusterNN: A hybrid classification approach to mobile activity recognition. In Proceedings of the 13th International Conference on Advances in Mobile Computing and Multimedia (pp. 263-267).
Abstract: Mobile activity recognition from sensor data relies on supervised learning algorithms, and many algorithms have been proposed for this task. One such algorithm is the K-nearest neighbour (KNN) algorithm. However, because KNN is an instance-based algorithm, its use in mobile activity recognition has been limited to offline evaluation on collected data: for KNN to work well, all training instances must be kept in memory for similarity measurement against the test instance, which is prohibitive in a mobile environment. We therefore propose an unsupervised learning step that reduces the training set to a fraction of its original size. The novel approach applies clustering to the dataset to obtain a set of micro clusters, from which cluster characteristics are extracted for similarity measurement with new, unseen data. This reduced representative set can then be used to classify new instances with a nearest neighbour step on the mobile phone. Experimental evaluation of the proposed approach on a real mobile activity recognition dataset shows improved results over the basic KNN algorithm.
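The abstract sketches a two-stage pipeline: an offline clustering step that compresses the training set into micro-cluster representatives, followed by nearest-neighbour classification against those representatives on the phone. The snippet below is a minimal sketch of that idea in Python, assuming per-class k-means as the clustering step and cluster centroids as the extracted cluster characteristics; the paper's actual clustering method and cluster features may differ.

```python
# Minimal sketch of the ClusterNN idea, assuming per-class k-means and
# centroid representatives (hypothetical choices, not the paper's exact method).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

def build_reduced_set(X_train, y_train, clusters_per_class=10):
    """Cluster each activity class into micro clusters and keep only the
    cluster centroids, labelled with their class, as the reduced training set."""
    centroids, labels = [], []
    for label in np.unique(y_train):
        X_c = X_train[y_train == label]
        k = min(clusters_per_class, len(X_c))  # guard against tiny classes
        km = KMeans(n_clusters=k, n_init=10).fit(X_c)
        centroids.append(km.cluster_centers_)
        labels.extend([label] * k)
    return np.vstack(centroids), np.array(labels)

def cluster_nn(X_train, y_train, X_test, n_neighbors=1):
    """Nearest-neighbour step over the reduced set: far fewer distance
    computations per test instance than KNN over the full training data."""
    X_red, y_red = build_reduced_set(X_train, y_train)
    knn = KNeighborsClassifier(n_neighbors=n_neighbors).fit(X_red, y_red)
    return knn.predict(X_test)
```

In this per-class variant every micro cluster is label-pure, and each test instance is compared against a handful of centroids rather than the entire training set, which is what makes the nearest-neighbour step feasible in memory-constrained mobile environments.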
URI: http://repository.futminna.edu.ng:8080/jspui/handle/123456789/3096
Appears in Collections: Computer Science
Files in This Item:
File | Description | Size | Format
---|---|---|---
222837979.pdf | | 833.4 kB | Adobe PDF