Building People-Centric Technology in Healthcare

Siddhartha C
5 min read · Jun 15, 2018

[Disclaimer: The ideas published below are my own, and do not reflect the views of my employer.]

Last year, INSEAD ran an Alumni Forum titled "Business as a Force for Good". It made me think about the need for sustainable and responsible business models. There is a deeper discussion to be had about the social and capitalistic aspects of business value creation, but that is a discussion for another day. What I do strongly feel is that the way we use the toolkits available today, be they technologies or commercial business models, needs to be inclusive. And I agree it's easier said than done.

Tina Woods and Dr. Indra Joshi kindly invited me to a roundtable with various stakeholders looking to streamline how AI is adopted in healthcare, and to explore how we could empower the existing ecosystem to build better people-centric healthcare models. Given the relatively small percentage of people with deep expertise in both ML/AI and healthcare, we need a better framework to help both expand the knowledge base and innovate responsibly. When it comes to working in an ecosystem that affects lives directly, we need to remember that doctors take the Hippocratic oath for a reason.

Which brings us to one of the often-asked questions: should there be an equivalent of the Hippocratic oath in the technology sphere? Albert Shum, in his recent blog post, shared the Iron Ring ritual followed in Canada, which shares a similar core with the Hippocratic oath. Harry Shum and Brad Smith shared their views in The Future Computed. At the end of the day, it comes down to being responsible. But responsible to whom?

The NHS has a rich history over the last 70 years, and today it is one of the largest single-payer health systems in the world. If we read the core principles of the NHS, they give us an insight into what the drivers of a socially-responsible system might look like:

· That it meet the needs of everyone

· That it be free at the point of delivery

· That it be based on clinical need, not ability to pay

What really caught my eye was the 5th principle, added post-2011, which reads as follows:

The NHS works across organisational boundaries and in partnership with other organisations in the interest of patients, local communities and the wider population.

Working in partnership with other organisations in the interest of patients, local communities and the wider population. To me, this reads as collaborating to improve people-centric care, personal health, and population health. Organizations such as Collider Health are not only welcome, but encouraged. We, who help contribute to innovations in healthcare, are responsible to ourselves, our community and the wider population that we impact.

Given that we are in one of the largest single-payer systems, we now need to look at how we can bring hospitals and experts together to collaborate, share knowledge, and learn from each other. How can we improve trust across the board to co-create a healthcare system for the future?

Expanding Cross-Disciplinary Ecosystems to drive people-centric healthcare outcomes

Whenever we create a new technology, we often fall into the trap of trying to use it to solve everything. To someone armed with a hammer, everything looks like a nail. The experienced craftsman knows where to use the hammer and, more importantly, where not to.

Personally, drawing on the knowledge shared at the conferences, roundtables and panels that I have been a part of, I think we need to break the journey down into three steps:

1. UX / HCI / Design Thinking Approaches: to help us identify the problems we face today and how we can solve them. What approaches can we take, and who would the subject matter experts be? What are the immediate test beds, and which ones need to be aggregated for future use?

2. AI/ML Readiness (for lack of a better term): this would primarily revolve around the regulation guiding the frameworks of the future; a taxonomy that helps healthcare providers, AI/ML researchers and consumers understand each other; data structures that drive standardization, not only to reduce integration overheads but also to help create transferable models and more robust anonymization and compliance practices; and, of course, ethical principles guiding what we can and cannot do, including a conversation around patient consent. Regulation as an enabler would not only reduce the uncertainty and ambiguity in the industry, but also provide a clear path for start-ups to augment innovation, increasing the choices, approaches and potential solutions available to improve healthcare (more on that later). Readiness would also include co-creating both formal and informal learning paths to drive cross-disciplinary skills and education.

3. AI/ML Adoption: once we actively start using AI in live environments, how can the MHRA help us quantify the equivalent of the ingredients and efficacy of the new treatment methodologies that we create? What data streams are needed and available to drive a model? How effective is an ML/AI model in a clinical setting? How can models interact and interoperate with each other? What could be the best practices for writing models and software in a healthcare ecosystem, along with templates and libraries for specific scenarios that could be audited and accredited by the NHS, and potentially opened up to start-ups and researchers, lowering the barriers to innovation?

Five years ago, I wanted to start a Population Health start-up in India (I talk about it here). I thought regulation was a blocker. But if, back then, I had had a process I could follow to gain the trust of the ecosystem, get validated and verified (similar to peer review when writing a paper), and potentially get access to limited datasets to show the efficacy of my approach, it would have been a different story. Alok Mittal was one of my mentors, and he gave me some really sound advice: you need to work top-down to make innovations work. If we can create the right regulatory framework, I firmly believe it can empower the start-ups and innovators of the future to play a larger part in improving healthcare outcomes.

[Updated to reflect grammatical corrections]


Siddhartha C

Health and Responsible AI @Microsoft | @INSEAD MBA 15D | MITxEntrepreneurship | TEDxINSEAD Crew | Ex-Social Entrepreneur