CHA-CHA: Characterizing Human Activities for Cancer Health Awareness
Automated analysis of physical activity (such as 'walking', 'squats', or 'jumping jacks') and activity performance from video is lacking in clinical, home-monitoring, and public health applications. Performance refers to quantitative measures of function, such as walking speed or the timing of a sit-to-stand test. Health-oriented applications are poorly developed, limited to a few publicly available image management and annotation tools. Automated analysis tools do exist, but they lack a health focus: existing software emphasizes counting and tracking customers, monitoring transportation behavior, and security concerns in the private and defense sectors. New tools for automated analysis could improve cancer prevention and control across the spectrum, from primary prevention (e.g., better evaluation of interventions that encourage physical activity), to enhanced epidemiological studies, to automated monitoring of symptoms and of response to treatment for diseases that affect physical performance, to improved compliance with cancer treatments and physical rehabilitation regimens.
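To make the notion of "performance" concrete, the following is a minimal sketch of how such quantitative measures might be derived from per-frame video analysis. The `Frame` type, field names, and example values are illustrative assumptions, not part of the CHA-CHA system.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    t: float      # timestamp in seconds (assumed available per frame)
    hip_x: float  # horizontal hip position in meters (assumed calibrated)

def walking_speed(frames: list[Frame]) -> float:
    """Average walking speed (m/s) over a tracked clip."""
    first, last = frames[0], frames[-1]
    return abs(last.hip_x - first.hip_x) / (last.t - first.t)

def sit_to_stand_time(events: list[tuple[float, str]]) -> float:
    """Elapsed time (s) from the first 'sitting' event to the first
    subsequent 'standing' event in a (timestamp, label) stream."""
    start = next(t for t, label in events if label == "sitting")
    end = next(t for t, label in events if label == "standing" and t > start)
    return end - start

clip = [Frame(0.0, 0.0), Frame(2.0, 2.4)]
print(walking_speed(clip))                        # 1.2 (m/s)
print(sit_to_stand_time([(0.0, "sitting"), (1.8, "standing")]))  # 1.8 (s)
```

Both measures reduce to simple arithmetic once an upstream model supplies calibrated positions or activity-state labels per frame.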
SIFT and Wake Forest School of Medicine created the Characterizing Human Activities for Cancer Health Awareness (CHA-CHA) system, which recognizes activities and activity performance, then translates that information into a human-interpretable explanation. CHA-CHA lets patients live their lives at home, unburdened, while physicians gain access to the activity and performance records needed to monitor their health state between visits. CHA-CHA recognizes activity and performance from video taken on a smartphone and sends reports and alerts directly to the patient's physician. We used state-of-the-art AI techniques that combine neural networks and symbolic reasoning to gather and reason over symbolic information and recognize the activities patients perform on video. Our subject-matter expert (SME) at Wake Forest School of Medicine ensured that the activities and performance parameters are directly relevant to the cancer health domain and evaluated the information provided to physicians through a series of usability tests. Preliminary results show that the classification accuracy of our system is comparable to state-of-the-art black-box systems, yet CHA-CHA goes beyond them by providing human-readable explanations for both classification and performance-deviation detection.
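The neural-plus-symbolic pattern described above can be sketched as follows: a (stubbed) neural classifier emits a symbolic activity label with a confidence score, and a symbolic layer compares a performance measurement against the patient's baseline to produce a physician-facing explanation or alert. All function names, the alert threshold, and the baseline value are illustrative assumptions, not the CHA-CHA implementation.

```python
def classify_activity(video_clip) -> tuple[str, float]:
    """Stand-in for a neural video classifier: returns (label, confidence).
    A real system would run a trained network over the clip."""
    return ("walking", 0.93)  # stubbed prediction

def explain(label: str, confidence: float,
            measured_speed: float, baseline_speed: float) -> str:
    """Symbolic layer: compare measured performance to the patient's
    baseline and generate a human-readable explanation."""
    deviation = (baseline_speed - measured_speed) / baseline_speed
    msg = f"Recognized '{label}' (confidence {confidence:.0%})."
    if deviation > 0.20:  # assumed alert threshold: >20% slowdown
        msg += (f" ALERT: walking speed {measured_speed:.2f} m/s is "
                f"{deviation:.0%} below the patient's baseline of "
                f"{baseline_speed:.2f} m/s.")
    else:
        msg += f" Walking speed {measured_speed:.2f} m/s is within the normal range."
    return msg

label, conf = classify_activity(None)
print(explain(label, conf, measured_speed=0.9, baseline_speed=1.3))
```

The key design point is that the neural component's output is symbolic (a label plus confidence), so downstream reasoning and the resulting explanation remain fully human-readable.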