Exploring Relational Agents for Different Healthcare Applications

Relational agents (RAs) are computer programs or virtual entities designed to interact with humans in ways that simulate social interaction. Equipped with artificial intelligence (AI) and natural language processing capabilities, they can engage in conversation, interpret emotions, and respond with empathetic, contextually appropriate behavior. RAs play a pivotal role in human-computer interaction, particularly in fields like healthcare, where personalized and compassionate communication is crucial.

In this research, we explore different aspects and applications of RAs in the domain of healthcare services. Our earlier work examined the efficacy, acceptance, usability, and other foundational measures of RAs for healthcare services, particularly during COVID-19. We are currently investigating future opportunities for employing RAs in diverse healthcare applications, including gestational diabetes, epidemic response, and health education. We are also working toward universal health coverage (UHC) in Bangladesh by developing RAs capable of interacting in the Bangla language. This research is conducted jointly with the Data and Design Nest at the University of Louisiana at Lafayette, USA. Outcomes of this initiative have been published in ACM UIST 2022, ACM HAI 2021, IEEE ISCC 2023, JMIR Human Factors, IJERPH, PervasiveHealth 2021, and DESRIST 2021.

Human Activity Recognition and Rehabilitation Exercise Evaluation

This project, funded by the ICT Division and IUB, aims to develop machine learning approaches for recognizing various activities from wearable sensors (e.g., accelerometers, gyroscopes) and motion-sensing devices (e.g., Kinect). In addition, we intend to develop models that can support people seeking rehabilitation at the CRP. Specifically, our plan is to record movement data while rehabilitation exercises are performed, and to complement the sensor data with visual recordings of patients performing the exercises, captured using 3D sensors. Finally, we intend to apply time series analysis to recognize each activity, measure (or grade) the patient's performance and level of improvement, and identify the areas and times at which the patient faces difficulty. To achieve automatic or semi-automatic activity recognition and exercise evaluation, we have designed self-attention and graph convolution based architectures. Our work on self-attention architectures for activity recognition from sensor data has been published in ECAI 2020 and PAKDD 2021.
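To illustrate the core operation behind such architectures, the sketch below applies scaled dot-product self-attention to a windowed sensor sequence. It is a minimal, self-contained NumPy illustration, not our published model: the window length, channel count (six channels standing in for 3-axis accelerometer plus gyroscope readings), projection sizes, and random weights are all hypothetical placeholders for learned parameters.

```python
import numpy as np

def self_attention(x, wq, wk, wv):
    """Scaled dot-product self-attention over one sensor window.

    x: (T, d) window of sensor features (T time steps, d channels)
    wq, wk, wv: (d, d_k) query/key/value projection matrices
    Returns (T, d_k) attended features.
    """
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])          # (T, T) pairwise logits
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over time steps
    return weights @ v

rng = np.random.default_rng(0)
T, d, dk = 50, 6, 16   # 50 time steps; 6 channels (e.g., 3-axis accel + gyro)
x = rng.standard_normal((T, d))
out = self_attention(x,
                     rng.standard_normal((d, dk)),
                     rng.standard_normal((d, dk)),
                     rng.standard_normal((d, dk)))
print(out.shape)  # (50, 16)
```

Each output time step is a weighted mixture of every other step in the window, which is what lets attention-based recognizers weigh informative phases of a movement more heavily than idle ones; a trained model would stack such layers with learned projections and a classification head.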