{"id":652,"date":"2025-07-15T20:08:54","date_gmt":"2025-07-15T20:08:54","guid":{"rendered":"https:\/\/ccds.ai\/?p=652"},"modified":"2025-08-10T18:15:09","modified_gmt":"2025-08-10T18:15:09","slug":"few-shot-human-activity-recognition-from-wearable-sensors","status":"publish","type":"post","link":"https:\/\/ccds.ai\/?p=652","title":{"rendered":"Few-Shot Human Activity Recognition from Wearable Sensors"},"content":{"rendered":"<div id='av_section_1'  class='avia-section av-md4yj5j2-56c9183728fd76a63563aae2d74db750 main_color avia-section-large avia-no-border-styling  avia-builder-el-0  avia-builder-el-no-sibling  avia-bg-style-scroll container_wrap fullsize'  ><div class='container av-section-cont-open' ><main  role=\"main\" itemprop=\"mainContentOfPage\"  class='template-page content  av-content-full alpha units'><div class='post-entry post-entry-type-page post-entry-652'><div class='entry-content-wrapper clearfix'>\n<section  class='av_textblock_section av-md4yk0b2-ab76dbbb3f094d0acc52708a4ba265fa'  itemscope=\"itemscope\" itemtype=\"https:\/\/schema.org\/BlogPosting\" itemprop=\"blogPost\" ><div class='avia_textblock'  itemprop=\"text\" ><p>We stand at the forefront of transforming remote healthcare by pioneering sensor-based human activity recognition (HAR). Our primary objective is to develop state-of-the-art machine learning models designed for deployment on remote devices, enabling continuous monitoring of patients and elderly individuals who require ongoing support. A significant challenge in this endeavor is the scarcity of labeled data for many activity classes, which makes it difficult to train traditional supervised models. To address this, we are working on the few-shot learning problem, so that our models can adapt to new activities from only a handful of labeled examples.
This work builds on our earlier work on self-attention-based HAR and on the assessment of rehabilitation exercises from sensor data.<\/p>\n<p><b>Related publications<\/b><\/p>\n<ol>\n<li aria-level=\"1\">Hierarchical Self Attention Based Autoencoder for Open-Set Human Activity Recognition, 25th Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD-2021), Springer, May 11-14, 2021, Delhi, India. [<a href=\"https:\/\/web.archive.org\/web\/20241102091400\/https:\/\/arxiv.org\/abs\/2103.04279\" target=\"_blank\" rel=\"noopener\">arxiv<\/a>]<\/li>\n<li aria-level=\"1\">Human Activity Recognition from Wearable Sensor Data using Self-Attention, in the proceedings of the 24th European Conference on Artificial Intelligence (ECAI), Spain, 2020. [<a href=\"https:\/\/web.archive.org\/web\/20241102091400\/https:\/\/ecai2020.eu\/papers\/1109_paper.pdf\" target=\"_blank\" rel=\"noopener\">pdf<\/a>]<\/li>\n<li aria-level=\"1\">Assessment of Rehabilitation Exercises from Depth Sensor Data, International Conference on Computer and Information Technology (ICCIT), Dhaka, Bangladesh, December 18-20, 2021. [<a href=\"https:\/\/web.archive.org\/web\/20241102091400\/https:\/\/agencylab.github.io\/pdfs\/ThesisRehab.pdf\" target=\"_blank\" rel=\"noopener\">pdf<\/a>]<\/li>\n<li aria-level=\"1\">An Integrated System for Stroke Rehabilitation Exercise Assessment using Kinect v2 and Machine Learning, International Conference on Intelligent Human Computer Interaction, LNCS, Springer, November
2023.<\/li>\n<\/ol>\n<\/div><\/section>\n\n","protected":false},"excerpt":{"rendered":"","protected":false},"author":2,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[88],"tags":[],"class_list":["post-652","post","type-post","status-publish","format-standard","hentry","category-ai_ml_projects"],"acf":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/ccds.ai\/index.php?rest_route=\/wp\/v2\/posts\/652","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/ccds.ai\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/ccds.ai\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/ccds.ai\/index.php?rest_route=\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/ccds.ai\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=652"}],"version-history":[{"count":1,"href":"https:\/\/ccds.ai\/index.php?rest_route=\/wp\/v2\/posts\/652\/revisions"}],"predecessor-version":[{"id":653,"href":"https:\/\/ccds.ai\/index.php?rest_route=\/wp\/v2\/posts\/652\/revisions\/653"}],"wp:attachment":[{"href":"https:\/\/ccds.ai\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=652"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/ccds.ai\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=652"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/ccds.ai\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=652"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}