
6 Innovations That Will Change Healthcare

Brian Eastwood | Feb. 19, 2013
When economists, data scientists and medical professionals team up, the result is often remarkable innovation. These six examples from the Massachusetts Institute of Technology's Future of Health and Wellness Conference could change the way patients interact with hospitals, physicians and each other.

5. Emotion Sensors: For the Willing, Anything Can Be Monitored

When you're hard at work, deep in thought or emotionally stimulated, your brain sends signals to your skin, especially your hands and feet. This is called electrodermal activity, and it's triggered by the sympathetic nervous system as part of the innate "fight or flight" response. Measuring this activity used to require wired sensors, which had their shortcomings; namely, wearers couldn't wash their hands. Advances in sensor technology now allow for noninvasive monitoring around the clock.

Much of the work done by Rosalind Picard, founder and director of the MIT Affective Computing Research Group, concerns children with autism. Emotion sensors such as the Q Sensor (made by Affectiva, which Picard co-founded) can indicate when a child is about to act out or has calmed down.
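How the sensor readings are interpreted isn't spelled out here, but as a rough, hypothetical illustration, a sustained rise in skin conductance could be flagged as rising arousal and a sustained fall as calming down. The window length, thresholds and the simple-threshold approach itself are assumptions, not Affectiva's or the Q Sensor's actual method.

```python
# Toy illustration only -- not the Q Sensor's actual algorithm.
# A sustained rise in skin conductance can flag rising arousal before a
# child acts out; a sustained fall can signal calming down.

def eda_trend(samples, window=30, rise=0.5, fall=-0.5):
    """samples: skin-conductance readings in microsiemens, one per second."""
    if len(samples) < window:
        return "not enough data"
    change = samples[-1] - samples[-window]   # net change over the last window
    if change >= rise:
        return "arousal rising"
    if change <= fall:
        return "calming down"
    return "steady"

# Example: conductance climbing from about 2.0 to 3.0 microsiemens in 30 seconds
readings = [2.0 + 0.033 * i for i in range(31)]
print(eda_trend(readings))  # "arousal rising"
```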

The results can dramatically alter treatment plans. In one example, Picard was with a former student with autism who was wearing the sensors while waiting to deliver a speech that had been delayed. The student paced to calm herself, but her friend told her to stop, saying it didn't help. Analyzing the electrodermal data afterward, Picard and the student determined that pacing did, in fact, help; the next time the student prepared for a speech, her friend let her pace.

Sensors don't even need to be worn. The Cardiocam, also an Affective Computing initiative, can measure heart rate and breathing rate through a webcam, while Affectiva's Affdex reads emotion through a webcam. Both offer opportunities for telemedicine and remote patient monitoring.
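Cardiocam's internals aren't described here, but the general idea behind camera-based pulse measurement is that skin color shifts almost imperceptibly with each heartbeat, so averaging pixel values over the face frame by frame and finding the dominant frequency recovers the pulse. The sketch below is illustrative only; the frame rate, frequency band and synthetic data are assumptions, not Affectiva's code.

```python
# Rough sketch of camera-based heart-rate estimation (not Cardiocam's code).
import numpy as np

FPS = 30  # assumed webcam frame rate

def heart_rate_bpm(face_frames):
    """face_frames: list of HxWx3 arrays cropped to the face region."""
    # Per-frame average of the green channel, which varies most with blood volume
    signal = np.array([frame[:, :, 1].mean() for frame in face_frames])
    signal = signal - signal.mean()                    # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FPS)  # in Hz
    # Keep only plausible pulse frequencies (roughly 40-180 beats per minute)
    band = (freqs >= 0.7) & (freqs <= 3.0)
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0

# Example with synthetic frames whose brightness pulses at ~72 beats per minute
t = np.arange(300) / FPS
frames = [np.full((64, 64, 3), 128.0) + 2.0 * np.sin(2 * np.pi * 1.2 * ti) for ti in t]
print(round(heart_rate_bpm(frames)))  # ~72
```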

6. Wellness Counseling: Sometimes, People Like Talking to Computers

If patients are being monitored by computers, then having computers talk to them isn't much of a stretch. And, notes Timothy Bickmore, professor of computer and information science at Northeastern University, patients are receptive to the idea.

Bickmore's lab, the Relational Agents Group, has run 12 clinical trials involving roughly 2,500 patients. These trials have encompassed exercise promotion (for both older, low-literacy patients and Parkinson's patients), a patient portal that promotes follow-up care, family medical history collection and discharge summary explanations for hospital patients. In all cases, patients interacting with an animated character, or relational agent, within their program were more active participants than those not working with an agent.

The agents themselves use a dialogue engine based on a branched, hierarchical transition network containing several thousand nodes in all; XML markup denotes synchronized nonverbal behavior such as vocalized pauses, head nods and hand gestures. That nonverbal behavior was modeled on observed patient-provider interactions, which revealed, for example, when a nurse points to a patient's discharge summary and what she points to. Patients don't talk back; instead, they use a touch screen to choose an answer to the agent's question and await a response.
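The structure Bickmore describes, a branched network of nodes in which agent speech is annotated with XML-style tags for nonverbal behavior and patients advance by tapping an on-screen answer, can be sketched roughly as below. The node names, tag names and dialogue text are invented for illustration; this is not the Relational Agents Group's actual engine or markup.

```python
# Minimal, hypothetical sketch of a branched dialogue transition network.
# Each node pairs an agent utterance -- with XML-style tags for nonverbal
# behavior such as nods and gestures -- with touch-screen answer choices
# that lead to the next node.

DIALOGUE = {
    "greet": {
        "speech": '<speech><nod/>Hello! <gesture target="summary"/>'
                  "Let's review your discharge summary.</speech>",
        "choices": {"OK, let's start": "meds", "I have a question": "question"},
    },
    "meds": {
        "speech": '<speech><gesture target="medication_list"/>'
                  "Here are the medications you will take at home.</speech>",
        "choices": {"Got it": "end", "Please repeat that": "meds"},
    },
    "question": {
        "speech": "<speech>Sure. Which part should we go over?</speech>",
        "choices": {"My medications": "meds", "Never mind": "end"},
    },
    "end": {
        "speech": "<speech><nod/>Great, we're done for today.</speech>",
        "choices": {},
    },
}

def run(node="greet"):
    """Walk the network: the agent 'speaks', the patient picks an on-screen answer."""
    while True:
        step = DIALOGUE[node]
        print(step["speech"])              # an animated agent would render the tags
        if not step["choices"]:
            break
        labels = list(step["choices"])
        for i, label in enumerate(labels, 1):
            print(f"  [{i}] {label}")      # touch-screen answer buttons
        pick = int(input("Choose: ")) - 1  # the patient taps instead of talking back
        node = step["choices"][labels[pick]]

if __name__ == "__main__":
    run()
```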

 
