
2 Core Projects

2.1 Home and Community Health and Wellness (HCHW)

2.1.1 Cueing Kitchen

ERC Team Members

Lead: Dan Ding (Pitt RST)

Faculty: Rory Cooper (Pitt RST), Martial Hebert (CMU RI), Annmarie Kelleher (Pitt RST),
Pam Toto (Pitt OT)

Postdoc: Kris Kitani (CMU RI)

PhD Students: Josh Telson (Pitt RST), Jing Wang (Pitt RST)

Undergrads: Shawn Hanna (Pitt CompE), Sossena Wood (Pitt BioE)

Project Goals and Impact

The goal of this project is to generate knowledge that will inform the design of technologies that adaptively coach people with cognitive impairments through tasks of everyday living. The general concept is for a computer to perceive a person’s actions, predict whether he or she will fail without some prompting, and, if so, intervene with appropriate cues to guide the person toward success. We have selected meal preparation and related kitchen tasks as a valuable representative set of applications. Our approach is a testbed: a fully functional kitchen augmented with instrumentation and a variety of computer-generated stimuli (cues), as well as computer-controlled actuators, for technology integration and evaluation in user studies.
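To make the concept concrete, the sketch below (plain Python; every function name, event string, and threshold is a hypothetical placeholder rather than part of the implemented system) illustrates one way the perceive-predict-intervene cycle could be expressed.

    # Minimal sketch of the perceive -> predict -> intervene concept; every name,
    # event string, and threshold here is a hypothetical placeholder.

    def needs_prompt(step, elapsed_s, last_event, timeout_s=30.0):
        """Predict failure: no progress for too long, or an action that is not part
        of the current step (i.e., one that may keep the user from the end goal)."""
        idle_too_long = elapsed_s > timeout_s
        wrong_action = last_event is not None and last_event not in step["expected_events"]
        return idle_too_long or wrong_action

    def deliver_cue(step, modality="verbal"):
        """Stand-in for the kitchen's cue modalities (audio, projection, lighting)."""
        print("[" + modality + " cue] " + step["instruction"])

    # Example: one recipe step observed through two sensed events.
    step = {"instruction": "Take the pot from the lower-left cabinet",
            "expected_events": {"cabinet_lower_left_opened"}}

    for elapsed_s, event in [(12.0, "drawer_top_right_opened"),     # wrong place: cue
                             (20.0, "cabinet_lower_left_opened")]:  # correct: no cue
        if needs_prompt(step, elapsed_s, event):
            deliver_cue(step)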

Specific outputs of the project will include knowledge about:

  • the relative strengths and weaknesses of different types of cues (e.g., sounds, spoken text, projected graphics, etc.) for guiding users who have different forms and severity of cognitive deficits
  • the requisite fidelity of activity recognition techniques and prediction algorithms
  • how well a computer-based cueing system compares to a human

Our initial target populations are people who periodically require cognitive support, including people with traumatic brain injury (TBI), mild cognitive impairment, early-stage Alzheimer’s disease (AD), and other dementias. Their quality of life will be improved by virtue of their being able to independently and confidently perform instrumental activities of daily living (IADLs). As such, the systems that we envision could improve other aspects of their lives, such as hygiene, nutrition, general safety, and even socialization. A secondary benefit will be reduced caregiver burden. This form of technology will also figure in the national dialog about extending disabled veterans’ benefits to cover lost quality of life.

Role in Support of QoLT Strategic Plan

An environment that can understand, respond, and actively assist people with cognitive impairments to complete daily tasks exemplifies our vision for intelligent systems working symbiotically with people. It has potential value beyond that, for example, as an instructional tool or training aid. Cueing Kitchen is a testbed for evaluation and refinement of recognition, learning and prediction algorithms being developed throughout the Perception and Awareness Thrust, as well as emotion recognition techniques being developed in the Human-System Interaction Thrust. The Person and Society Thrust selected appropriate QoL and project-specific measurement instruments (<list>).

The project team has expertise in electrical engineering, rehabilitation engineering, computer vision, and occupational therapy. We have included clinicians and prospective end-users from the outset of the project. In early focus groups, we learned that system functionality is deemed valuable, but only if kitchen aesthetics can be preserved. Through a Wizard of Oz pilot study that included both TBI and AD patients, we learned that it would be wise to focus on a single population. We selected TBI patients because they are generally able to follow recipes, but doing so is often quite stressful for them.

Fundamental Research Barriers and How They Are Being Addressed

Our overall objective is to determine how cueing/prompting can best be used to support people with cognitive impairments in executing a sequence of task steps. The project addresses several fundamental QoLT barriers.

  • “Individual Differences and Unpredictability” – The project aims to develop cueing systems that deliver cues if and only if they are needed, taking into consideration the nuances of cognitive impairments across people – even within the same disease population – and the day-to-day variability or progressive decline of a single user. We expect that cueing systems such as this will ultimately have to learn the idiosyncrasies of each individual user. We are working closely with clinicians to understand how they provide prompts and cues to assist persons with different levels of cognitive ability to complete tasks.
  • “User Acceptance” – People with different levels of cognitive impairment may respond to cues in different ways, and different tasks may require different types of cues. Further, it is not yet known whether it is preferable to advance from one instruction to the next manually or automatically. The user study we are conducting will examine not only the effectiveness of different types of cues, but also their usability and user preferences (for both end-users and caregivers).
  • “Model Noise and Uncertainty” – For cueing to be welcomed and useful, it is crucial that the system accurately recognize the tasks being performed and correctly predict whether the user requires a prompt. Reliability is most likely the highest barrier to adoption. Cueing Kitchen is leveraging Perception and Awareness Thrust technologies: object identification from the Recognition project, activity recognition from the Learning and Grand Challenge projects, and application of First-Person Vision from the Sensing project. We are deliberately exploring multiple sensing approaches in parallel; an eventual, more cost-effective system would include only the elements that are really needed.
  • “Market Factors” – It is not yet clear how this technology will fit into the reimbursement models of government and private insurers. Occupational/Vocational Rehab agencies would likely pay for similar technologies that coach work-related tasks and skills. At this stage of technology development we are more concerned with proving technical and clinical viability.

Over the past two years, this project has served as a platform for five Research Experiences for Undergraduates program participants, three of whom are under-represented minorities.

Achievements

We have built a kitchen that can monitor users’ actions via low-level sensors (e.g., contact switches, current sensors, and RFID tags) and vision sensors. It can deliver different cues for task guidance, including verbal or non-verbal audio; projected text, pictures/graphics, and video; and special lighting (illuminated cabinet doors and handles) that does not compromise the aesthetics or add to the visual complexity of the space.
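As an illustration of how these cue modalities can be organized in software, the sketch below defines a simple cue record and dispatcher in Python; the class, field names, and print-based handlers are assumptions standing in for the kitchen’s actual output devices.

    from dataclasses import dataclass

    @dataclass
    class Cue:
        modality: str   # "audio_verbal", "audio_nonverbal", "projected_text",
                        # "projected_image", "projected_video", or "lighting"
        target: str     # e.g., which cabinet handle to illuminate or where to project
        content: str    # spoken text, media file path, or lighting pattern name

    def deliver(cue: Cue) -> None:
        """Stand-in dispatcher: route a cue to the matching output device."""
        handlers = {
            "audio_verbal":    lambda c: print("speak: " + c.content),
            "audio_nonverbal": lambda c: print("play sound: " + c.content),
            "projected_text":  lambda c: print("project text at " + c.target + ": " + c.content),
            "projected_image": lambda c: print("project image at " + c.target + ": " + c.content),
            "projected_video": lambda c: print("project video at " + c.target + ": " + c.content),
            "lighting":        lambda c: print("illuminate " + c.target + " (" + c.content + ")"),
        }
        handlers[cue.modality](cue)

    # Example: guide the user to the cabinet that holds the pot.
    deliver(Cue(modality="lighting", target="cabinet_lower_left", content="pulse handle LED"))
    deliver(Cue(modality="audio_verbal", target="room", content="The pot is in the lower-left cabinet."))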

We completed three testing sessions to fine-tune our study protocol. The first session aimed to identify the specific subtasks of a simple meal preparation that require external cues to promote adequate task completion in people with cognitive impairments. When our occupational therapist deemed that the subject had not performed any actions for a certain amount of time, or had taken an action that would prevent the subject from accomplishing the end goal, she gave him a prompt. We recorded all the sensed events, including cabinets opening/closing and appliances switching on/off, and stored them in a database. The sensor data will be used to develop algorithms that can predict whether a step is a 'prompt' step or a 'no prompt' step. The second and third sessions aimed to evaluate the usability, usefulness, acceptance, and preference of different types of cues to assist persons with cognitive impairments in locating items for a recipe or cleaning up the kitchen. We asked subjects to complete trials by following different types of cues (e.g., verbal, window transparency, light in the form of illuminated cabinets and handles, and projected image cues). We recorded task completion time, perceived task load, and user preference. The IRB approved the study protocol, and several mockup trials were conducted to improve the system robustness. Subject testing is scheduled to start in January 2012.
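The sketch below uses Python’s standard sqlite3 module to show, under stated assumptions, how sensed events could be stored and a step given a provisional 'prompt'/'no prompt' label with an idle-time rule similar to the therapist’s criterion; the table layout, column names, threshold, and example rows are illustrative, not study data or the project’s actual schema.

    import sqlite3

    conn = sqlite3.connect(":memory:")   # a real trial would use a file on disk
    conn.execute("""
        CREATE TABLE events (
            t_seconds REAL,      -- time since trial start
            sensor    TEXT,      -- e.g., cabinet_3, stove_burner_1, faucet
            event     TEXT,      -- e.g., opened, closed, switched_on, switched_off
            step_id   INTEGER    -- which recipe step the event belongs to
        )""")

    # Hand-made example rows for illustration (not real study data).
    rows = [(4.2,  "cabinet_3",      "opened",      1),
            (9.8,  "cabinet_3",      "closed",      1),
            (15.0, "faucet",         "switched_on", 2),
            (61.5, "stove_burner_1", "switched_on", 2)]
    conn.executemany("INSERT INTO events VALUES (?, ?, ?, ?)", rows)

    def label_step(step_id, idle_threshold_s=30.0):
        """Label a step 'prompt' if the largest gap between its consecutive events
        exceeds idle_threshold_s, else 'no prompt' (a simple illustrative rule)."""
        times = [t for (t,) in conn.execute(
            "SELECT t_seconds FROM events WHERE step_id = ? ORDER BY t_seconds", (step_id,))]
        gaps = [b - a for a, b in zip(times, times[1:])]
        return "prompt" if gaps and max(gaps) > idle_threshold_s else "no prompt"

    print(label_step(1))   # -> no prompt (events 5.6 s apart)
    print(label_step(2))   # -> prompt (46.5 s with no sensed action)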

Relevant Work Being Conducted Elsewhere; How this Project Differs

Most groups that conduct research in this area focus on embedding technology in the living environment for monitoring and recognizing user behaviors [Franco 08; Lotfi 11]. The ultimate goal of monitoring is to provide appropriate assistance to the resident. Previous work such as the development of the GUIDE system [O’Neil 08] and the COACH system [Mihailidis 08] has indicated that some types of prompting seem to be more effective than others. Some smart homes can provide task guidance by displaying task videos or giving audio instructions. However, no guidelines are available regarding how different types of cues, encompassing both the level of information and the modality of presentation, can be tailored to the individual’s level of cognitive, sensory, and physical impairment and to the complexity of the task. In addition, Tsui and Yanco [Tsui 10] published a review paper on prompting devices for task sequencing and pointed out that closed-loop user feedback is missing from all the surveyed devices, and that future devices should consider user conditions and automatically provide a minimal level of prompting to the user. Modayil et al. [Modayil 08] investigated how sensors can improve a portable reminder system that helps individuals accomplish their daily routines; however, only the concept was presented, and the example scenarios focused solely on using user location to release cues more effectively.

Our work differs from the above in that we embed technology in the living environment to provide active assistance to the resident, not merely to monitor behavior. We are interested in determining when and how to provide prompts/cues to guide users to independently complete sequenced common kitchen tasks. Providing cues at appropriate moments will increase the likelihood of correct responses from users and promote technology acceptance. Understanding how different forms of cues for task instructions are processed and interpreted by people with different levels of cognitive, sensory, and physical deficits will help guide the future design of technology-embedded environments.

Future Plans, Deliverables and Milestones

The project is currently moving from QoLT-TRL3 (equivalent to an NSF “Early Prototype”) to QoLT-TRL4 (NSF “Developing Prototype”). Our intention is to take it through QoLT-TRL5 with ERC funding and through QoLT-TRL6 with other funding. As described in Project Goals and Impact above, the outputs of the project will be knowledge that is disseminated. Milestones for all projects are presented in Table QoLT-Strategy-3 in volume 1.

In the upcoming year, about 30 individuals with cognitive impairments and 10 clinical professionals will be tested under the previously mentioned protocol. The data will help us understand the relative value of different forms of cues for task guidance and create a preliminary hierarchy of cues that can be used to determine appropriate cues for people with different levels of cognitive impairment and different types of tasks. We will also work on extending the small kitchen setup to a full-sized kitchen in our new lab space. The full-size kitchen has been built, and we have started embedding the sensors to monitor the opened/closed status of the cabinets and drawers, and the use of the stove, oven, and other smaller appliances. We have also started to develop different types of cues to guide appliance use, and will continue to test subjects to compare conventional cueing based on task analysis with automatic cueing based on sensed user actions, as sketched below.
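The sketch below contrasts the two cueing policies in Python under simple assumptions; the step/event structures, names, and 30-second idle threshold are illustrative and do not represent the protocol that will actually be tested.

    # Hypothetical sketch of the two cueing policies to be compared.

    def conventional_cues(task_steps):
        """Conventional cueing: every step from the task analysis receives a cue,
        issued in fixed order regardless of what the user is doing."""
        for step in task_steps:
            yield step["instruction"]

    def automatic_cues(task_steps, sensed_events, idle_threshold_s=30.0):
        """Automatic cueing: advance when a step's expected sensor event is seen;
        cue only if the user idles past a threshold on the current step."""
        i, last_progress_t = 0, 0.0
        for t, event in sensed_events:                     # time-ordered (t, event) pairs
            if i >= len(task_steps):
                break
            if event == task_steps[i]["expected_event"]:   # step completed without help
                i, last_progress_t = i + 1, t
            elif t - last_progress_t > idle_threshold_s:   # user appears stuck
                yield task_steps[i]["instruction"]
                last_progress_t = t                        # avoid repeating immediately

    steps = [{"instruction": "Open the pantry",   "expected_event": "pantry_opened"},
             {"instruction": "Turn on the stove", "expected_event": "stove_on"}]
    events = [(5.0, "pantry_opened"), (50.0, "fridge_opened"), (55.0, "stove_on")]
    print(list(conventional_cues(steps)))       # cues for every step
    print(list(automatic_cues(steps, events)))  # only "Turn on the stove" (idle > 30 s)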

Publication

Ding D, Cooper RA, Pasquina PF, and Fici-Pasquina L (2011). Sensor technology for smart homes, Maturitas, Vol. 69, No. 2, pp. 131-6.

Literature Cited

[Franco 08] Franco, G.C., Gallay, F., Berenguer, M., et al. (2008) “Non-invasive monitoring of the activities of daily living of the elderly people at home – a pilot study of the usage of domestic appliances.” Journal of Telemedicine and Telecare, 14(5): 231-235.

[Lotfi 11] Lotfi, A., Langensiepen, C., Mahmoud, S.M., and Akhlaghinia, M.J. (2011) “Smart homes for the elderly dementia sufferers: identification and prediction of abnormal behavior.” Journal of Ambient Intelligence and Humanized Computing.

[Mihailidis 08] Mihailidis, A., Boger, J.N., Craig, T., and Hoey, J. (2008) “The COACH prompting system to assist older adults with dementia through handwashing: an efficacy study.” BMC Geriatrics, 8(28).

[Modayil 08] Modayil, J., Levinson, R., Harman, C., Halper, D., and Kautz, H. (2008) “Integrating Sensing and Cueing for More Effective Activity Reminders.” AAAI Fall 2008 Symposium on AI in Eldercare: New Solutions to Old Problems, Washington, DC, November 7-9.

[O’Neil 08] O’Neil, B., and Gillespie, A. (2008) “Simulating naturalistic instruction: The case for a voice-mediated interface for assistive technology for cognition.” J Assist Technol, 2: 28-31.

[Tsui 10] Tsui, K.M., and Yanco, H.A. (2010) “Prompting devices: a survey of memory aids for task sequencing.” QoLT Symposium.