Key Takeaways

- A lot of cumbersome tasks can be automated for clinicians
- A dual career in medicine and tech is as rewarding as it is challenging
- Healthcare AI needs to be steered by a responsibility framework to avoid harm

This interview has been edited for length and clarity.

With the rise of digital technology in care delivery, clinical work hasn’t necessarily become easier, and care teams are feeling the effects. For example, in a poll by Stanford Medicine, 71% of physicians said that EHRs contribute to physician burnout.

As the healthcare landscape continues to digitize, how can technology developers help create more effective and responsible solutions?

In this episode of the Memora Health Care Delivery Podcast, our guest Graham Walker, MD, founder of MDCalc and Co-Director of Advanced Development at The Permanente Medical Group, reflects on his efforts to digitally transform clinical work, the complexities of working across medicine and tech, and how he’s thinking about responsible AI use.

A lot of cumbersome tasks can be automated for clinicians

It’s difficult to imagine clinical environments without the EHR; over 80% of office-based physicians use one. Yet digital data collection at the bedside has only become commonplace in the past few decades. Before that, doctors and nurses recorded information with pen and paper.

The same analog process applied to studying medicine. Dr. Walker reminisces, “We would carry around these little quarter-sized books in our pockets. And they would have literally all of hospital medicine inside this little book. And some included scores. There were a handful of scores back in the day that kind of predicted good or bad outcomes for patients. People would ask you, ‘Oh, what would the score be on this patient?’ And I was like, ‘Why are you asking me to memorize this?’”

That last question is exactly what spurred Dr. Walker’s interest in developing tools that simplify and automate routine tasks for care teams. He remarks, “I really felt like there's stuff that needs to be memorized in medical school. And then, as a tech guy, I definitely felt like, ‘why do I need to use neurons for this when there's literally the world's information on a desktop?’ There wasn't an iPhone yet, so there really wasn't internet browsing on cell phones, but there were desktops everywhere and the world's information was available.”
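
To make that concrete, here is a minimal sketch of what automating one of those pocket-book scores can look like. The CHA₂DS₂-VASc criteria below follow the published scoring rules, but the code is purely illustrative: it is not MDCalc’s implementation and is not intended for actual patient care.

```python
# Illustrative only: a simple CHA2DS2-VASc calculator of the kind a
# clinical-calculator tool might automate. Not MDCalc's code and not
# a substitute for a validated calculator or clinical judgment.
from dataclasses import dataclass

@dataclass
class Patient:
    age: int
    female: bool
    heart_failure: bool
    hypertension: bool
    diabetes: bool
    prior_stroke_or_tia: bool   # includes prior thromboembolism
    vascular_disease: bool

def cha2ds2_vasc(p: Patient) -> int:
    """Return the CHA2DS2-VASc stroke-risk score (0-9)."""
    score = 0
    score += 2 if p.age >= 75 else (1 if p.age >= 65 else 0)
    score += 1 if p.female else 0
    score += 1 if p.heart_failure else 0
    score += 1 if p.hypertension else 0
    score += 1 if p.diabetes else 0
    score += 2 if p.prior_stroke_or_tia else 0
    score += 1 if p.vascular_disease else 0
    return score

# Example: a 78-year-old woman with hypertension scores 2 + 1 + 1 = 4.
print(cha2ds2_vasc(Patient(age=78, female=True, heart_failure=False,
                           hypertension=True, diabetes=False,
                           prior_stroke_or_tia=False,
                           vascular_disease=False)))
```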

A dual career in medicine and tech is as rewarding as it is challenging

Apart from creating MDCalc, Dr. Walker is a practicing ED physician at The Permanente Medical Group. One might think balancing both worlds would be untenable, but Dr. Walker explains, “I always came back to the fact that I can't do medicine as a side gig. Maybe that's changing a little bit now with more people doing consulting and part-time work and stuff like that. But at the time, I definitely didn't feel like I could be a good doctor on the side.”

He adds, “I wanna be a good doctor. I enjoy it. But I feel like [medicine and tech are] complementary. Medicine has a really unique set of challenges for my brain and my body and my soul and my emotions. And then programming and tech stuff is a totally separate challenge that similarly activates me and gets me excited, but in a totally different way.”

The content on Memora Health’s intelligent care enablement platform is maintained in part by an in-house bench of experienced clinicians who strike a similar balance. This not only lends rich clinical depth to our technology, but also helps ensure care team workflows are consistently considered.

Healthcare AI needs to be steered by a responsibility framework to avoid harm

One technological innovation widely touted to radically simplify — and even reduce — administrative tasks for clinicians is AI. And, for Dr. Walker, its potential to transform care is simultaneously exciting and worrisome.

He says, “ChatGPT launched, and I was just obsessed. I was showing everybody at work. It totally re-engaged me, and it made me both excited and a little scared for the future — as I think it did for everybody. I felt like I could see a bunch of different futures and where things could go. And I felt like I wanted to try to shape those futures … My worry [was] that this could potentially harm patients, but could also be really helpful for my colleagues, for the whole healthcare system.”

Driven to help guide AI toward safe and effective outcomes, Dr. Walker has been collaborating with physicians, both in the U.S. and around the world, to establish a charter on the responsible use of intelligent technologies. He explains, “I met a bunch of people from LinkedIn and we all just decided to make a collaborative Google Doc. We all saw ChatGPT and we thought, ‘Hey, what do we want the future to look like?’ We should tell people what we want the future to look like, again, for ourselves and for our patients, and for protecting the patient-physician bond and relationship.”

Memora Health’s AI development and implementation processes are guided by a foundational responsible AI framework. Specifically, our key tenets help ensure our platform is designed for safety and reliability, maintains human accountability and clinical oversight, promotes equity and fairness, and safeguards privacy and security.

Technological innovation in healthcare will only accelerate. By understanding how digital solutions have changed care delivery, the importance of clinical input in developing platforms, and responsible practices in AI development, the entire system can make more reliable decisions that benefit patients and clinicians. Ultimately, it’s up to healthcare leaders to steer their ships in the right direction so the care continuum as a whole can evolve.