Science-fiction writer William Gibson once observed, "The future is already here – it's just not evenly distributed." The same might be said for the cutting-edge world of precision medicine: some clinical areas have made remarkable progress in matching patients with appropriate treatments while others struggle, and the integration of data-driven medicine into routine practice remains frustratingly slow. On Wednesday afternoon, panelists at the "Heralding in a New Era of Precision Medicine" Super Session discussed some of the successes, setbacks and opportunities in this space.
Cancer care has been transformed by the recognition that tumors are best understood by identifying molecular features that can inform treatment. Melanie Nallicheri, Chief Business Officer at Foundation Medicine, described how her company's genomic testing is guiding oncologists toward better treatment selection for their patients and facilitating smarter drug testing. "We have sequenced 200,000 cancer patients – that is market-leading scale," she said. "We can use that data for trial enrollment and for trial execution."
In other indications, however, clinicians are still trying to break free from the big 'bins' historically used to classify complex disorders like heart disease. "By looking closely at clinical health records and genomics and marrying them together, we can see true subsets," said Isaac Kohane, Chair of Biomedical Informatics at Harvard Medical School. "There is an opportunity to use these datasets and machine learning to find these subsets, so clinical trials are not prone to failure." In this context, real-world data culled from actual patients could lead to smarter disease classifications that could in turn enable better-designed trials.
The advent of electronic health records has been something of a mixed bag. Although data on prescriptions and diagnostic tests are highly structured and organized, much of the information from a doctor's examination is written out as free text that is difficult to digest and analyze computationally. Furthermore, the quality and level of detail generally fall well short of what would be seen in a clinical trial.
Neal Meropol, VP of Research Oncology at Flatiron Health, is relying on an army of experts to translate these kludgy documents into data computers can use. "To understand the richness of that information, it takes more than a computer today can do," said Meropol. "It takes humans… what elements are high-quality and approximate those of a prospective clinical trial, and which are garbage." Kohane disagreed, however, arguing that machine-learning methods are already beginning to outperform human curators.
More generally, there was considerable discussion about what artificial intelligence can and cannot do. Many of the most exciting developments are happening in imaging, as highlighted by Harvard cardiologist Scott Solomon. "We did a machine-learning project to look at whether the way the heart moved, the pattern, could help us phenotype patients better," he said. "It was way better than all of the other things we knew about these people." Even 'dirty' data can be astonishingly predictive. For example, Kohane described how poorly structured emergency room discharge data could flag cases of domestic abuse before the authorities did – but he also noted that such promising demonstrations did not lead to real-world use at hospitals. "The C-suite had different priorities," he said. "There has to be will."
This is a time of transition, but the panelists were excited about the foundation being laid for a far more intelligent approach to medical practice. “I would love to see an algorithm that could take into account the totality of information in a patient’s record that predicts risk of a bad event or a tumor’s behavior,” said Meropol. “This is a problem that could be solved with math, but we’re not there today.”