
Do no harm: an oath for health IT developers

Kacy Zurkus | Aug. 2, 2016
The risks for enterprises when security in health IT lags behind

Requiring software engineers to take the developers' equivalent of the Hippocratic oath, said Harrington, "would realign their priorities to patient health. On-time delivery, hitting go-to-market timelines, cost considerations: these are all business decisions related to the development of that solution."

Developers need to be cognizant of those business concerns, but development practices should reflect an awareness that what they are building could affect patient health.

The risks to patient health, explained Michael Borohovski, CTO and co-founder of Tinfoil Security, extend beyond directly causing a patient harm or pain.

"Imagine for a moment that there was a test for pancreatic cancer that wasn't well tested, and the false negative rate was pretty high, 50/50 right/wrong. If that were the case and patients relied on it, they could go for another year potentially living with cancer, not knowing that they have it. The software isn't actively harming the patient; the harm comes from a mistaken diagnosis or test result."

Mistakes made in a rush to market, particularly in the study of human genomes, can cause serious harm to patients, but the business goal for developers is to make a profit in addition to helping people.

"The Hippocratic oath might be a bit of a stretch. It's a little different in that doctors are exclusively there to help patients; they don't have a duty to shareholders. Developers' duty is to shareholders, not to the patients or to the people whose data they store. But implicit in that duty to shareholders is the responsibility to find and patch vulnerabilities," Borohovski said.

What needs to change, then, is the culture around security. Given that no software can ever be 100% secure, "companies need to adopt a culture of responding to security vulnerabilities quickly and with a vengeance," he continued.

The current culture and the restrictions on security researchers, Borohovski said, "don't incentivize researchers to be ethical. Reporting a vulnerability could get you thrown in jail."

It behooves developers who work with or store sensitive data to do everything they can to find vulnerabilities. "Redefining the culture to make it easier to report will allow researchers to make more concerted efforts to find vulnerabilities," Borohovski said.

Calling for a change in culture, as opposed to holding developers to a higher ethical standard, might be an easy out, though.

Grant Elliott, founder and CEO of Ostendio, said, "We would simply be happy for them to meet general industry standards. Healthcare as an industry is significantly behind. The imperative or incentive to try and meet these basic security requirements doesn't seem to be as urgent, for many reasons."

In the healthcare industry, the consequences of security risks are not as clear as they are in other sectors, like retail or banking. People know of the Target breach, so they can avoid shopping at Target; there is an obvious bottom-line impact, said Elliott.



Sign up for Computerworld eNewsletters.