Signals in the noise (Part 2 of 2)

Informatics challenges hinder progress in personalized medicine development

Randall C Willis
 
Tribulations for trials
 
"One of the key causes of the downfall of the blockbusterdrug era is the fact that the artificial world of randomized controlled trials(RCTs)—where one attempts to remove all potential explainers, or confounders,of outcomes other than the presence or absence of the medicine being tested—isso far removed from the real world in which patients reside," says ThomasNeyarapally, senior vice president of corporate development at Cambridge,Mass.-based GNS Healthcare Inc. 
 
"A great deal of the massive investment in comparativeeffectiveness research suffers from a parallel infirmity—comparing Treatment Aversus Treatment B on average across an entire population, rather thanrecognizing that due to patient heterogeneity, the reality is typically that Ais better for some patients and B is better for other patients," he adds. "Thechallenge, which most traditional approaches are unable to address, is how toreliably identify which patients belong in which category."
 
 
In short, just because a drug passes the RCT hurdle, there is no guarantee it will work in a real-world patient with several comorbidities who is taking several other drugs.
 
Brophy agrees and suggests that moving forward, clinical trials will have to become more adaptive or responsive to the data arising from diagnostic signatures.
 
 
"The example I can give is with our imaging molecules for,say, angiogenesis, where clinicians running a trial can expose patients to atherapeutic and watch the molecular pathways respond," he says.
 
 
This type of real-time feedback lends itself to adaptive clinical trial design and would allow clinicians—and regulators—to have increased statistical confidence in the cohort they are treating.
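One common form such feedback takes is response-adaptive randomization. The sketch below uses Thompson sampling on invented arm names and response rates—it is a generic illustration, not the method of any trial discussed here.

```python
import numpy as np

rng = np.random.default_rng(1)

true_response = {"candidate": 0.55, "control": 0.35}   # hidden truth (invented)
successes = {a: 0 for a in true_response}
failures = {a: 0 for a in true_response}
assigned = {a: 0 for a in true_response}

for _ in range(2000):
    # Sample a plausible response rate for each arm from its Beta posterior,
    # then assign the next patient to the arm that currently looks best.
    draws = {a: rng.beta(successes[a] + 1, failures[a] + 1) for a in true_response}
    arm = max(draws, key=draws.get)
    assigned[arm] += 1
    if rng.uniform() < true_response[arm]:
        successes[arm] += 1
    else:
        failures[arm] += 1
```

As evidence accumulates, allocation drifts toward the better-performing arm—the statistical confidence the cohort gains is bought by letting incoming data steer the design.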
 
 
"Let's look at some of the Alzheimer's therapeuticcandidates that recently failed to achieve their primary endpoints," Brophyoffers. "Subsequent analysis showed that many of the patients in the trialsshowed no signs of the amyloid buildup typical of Alzheimer's patients.
 
 
"In hindsight, to show a greater effect of treatment, youwould have wanted to identify the patients with increased amyloid, and thereare imaging and molecular tests that would have helped you do that," heconcludes.
 
 
Building better outcomes
 
 
For GNS, it's about building a better model of patient outcomes. And from its previous incarnation as Gene Network Sciences, GNS knows something about model building.
 
 
"In its early days, GNS was an advanced systems biologycompany using patented mathematical and representational techniques to convertbiological relationships that were believed to be 'known' into computationalmodels of the systems being studied," Neyarapally recounts. "The key challengein this approach is that one is limited to the relationships that one alreadyknows—if, in fact, such relationships are actually valid—such that if only asmall fraction of the total biology needed to accurately model a given systemis known, then the resulting models from this approach may be of limited valuein drug and biomarker discovery."
 
 
Recognizing this issue, GNS decided the real opportunity was in learning the probabilistic cause-and-effect relationships in data that reflect the systems under study. This gave rise to the company's reverse-engineering forward-simulation (REFS) system, a supercomputer-driven platform that enables the learning of such relationships, at scale, directly from experimental or observational data.
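The REFS platform itself is proprietary, but the kind of statistical signal that data-driven causal structure learning exploits can be shown on a toy causal chain (all variable names and coefficients below are invented): two variables that are strongly correlated on their own become conditionally independent once the mediating variable is accounted for, which is how such methods distinguish direct from indirect relationships.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000

# Invented causal chain: gene -> protein -> outcome.
gene = rng.normal(size=n)
protein = 0.8 * gene + rng.normal(size=n)
outcome = 0.8 * protein + rng.normal(size=n)

def partial_corr(x, y, z):
    """Correlation of x and y after linearly regressing z out of both."""
    rx = x - np.polyval(np.polyfit(z, x, 1), z)
    ry = y - np.polyval(np.polyfit(z, y, 1), z)
    return np.corrcoef(rx, ry)[0, 1]

marginal = np.corrcoef(gene, outcome)[0, 1]          # clearly nonzero
conditional = partial_corr(gene, outcome, protein)   # near zero: chain structure
```

A structure-learning algorithm applies tests like this across many variable triples to propose which edges belong in the network; REFS additionally scores whole ensembles of candidate models, which this sketch does not attempt.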
 
"GNS works with real-world data such as medical and pharmacyclaims and EMR data, which can be licensed in bulk from vendors, to learnmodels that explain which types of patients are likely to benefit from whichtypes of treatments, given the rich history of those specific patients," heexplains. "Such models can then provide further context for clinical drugdevelopment."
 
 
Using these models, researchers can ask myriad "what-if" questions from their own experiences in the disease state of interest.
 
The REFS platform has definitely garnered interest; in the last three months, GNS has announced several pharma collaborations. With Bristol-Myers Squibb Co. (BMS), GNS is examining inflammatory disease as characterized by genetic markers, gene expression, blood markers and outcomes, looking at how these parameters are impacted by BMS' anti-inflammatory candidates and identifying potential new therapeutic targets.
 
A second collaboration is with Princeton, N.J.-based contract research organization Covance, where data generated from its extensive clinical and scientific support of pharmaceutical drug development will be analyzed by the REFS platform to build models that take in information about a proposed clinical candidate and output the likelihood of success in developing that candidate.
 
 
"Although the first project in this collaboration istargeted at diabetes medicines, the same methodology can be utilized across alltherapeutic classes," offers Neyarapally.

System flexibility is key, as a truly comprehensive informatics solution should be able to accept inputs from any number of modalities and digest that information so that it can be fed back into the clinical trial process.

Beyond the mundane—but not trivial—challenges of aggregating and normalizing data from disparate modalities, and the further challenges of the disparate time scales associated with different types of measurement, says Brophy, the challenge becomes an opportunity when one applies computational platforms that flexibly integrate data from multiple modalities, and the element of time, into one model of the system of interest.
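As a small, hypothetical illustration of the time-scale problem, frequent measurements from one modality can be aligned to the most recent value of a sparser one, for instance with pandas' `merge_asof` (the column names and values below are invented):

```python
import pandas as pd

# Two modalities on different time scales: daily vitals, and a lab
# panel drawn only at clinic visits (all values invented).
vitals = pd.DataFrame({
    "day": [1, 2, 3, 4, 5, 6],
    "heart_rate": [72, 75, 71, 80, 78, 74],
})
labs = pd.DataFrame({
    "day": [1, 4],
    "crp": [2.1, 9.7],   # sparse inflammation marker
})

# Align each vitals reading with the most recent lab value at or before it,
# so both modalities land in one patient-level table for modeling.
merged = pd.merge_asof(vitals, labs, on="day", direction="backward")
```

This only solves the mechanical alignment step; whether carrying a lab value forward is clinically valid is a modeling decision in its own right.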
 
"The use of multiple data modalities together to answer keyclinical questions remains a large, missed opportunity across healthcare," saysNeyarapally. "The reality is that each of these modalities has somethingdifferent and important to say about the state of a patient."
 
 
At the same time, it is also important to remember that researchers are dealing with clinicians whose first priority is the patient, not necessarily the clinical trial.
 
"From the perspective of the physician or nurse, we want tohelp them incorporate research samples and results into patient treatment aspainlessly as possible," Meek adds. "Because their focus is on the patient, theresearch project is a secondary priority. The interface has to be simple, andwe need to constantly prioritize the tests for the samples that are available." 


Meek offers the vivid example of a nurse whose patient is vomiting. The nurse may only have been able to get two blood draws—her focus is, and should be, on helping her vomiting patient. Thus, testing on those draws needs to be prioritized to ensure the researcher gets as much information as possible, given the limitations of clinical realities.
 
 
"The whole point is to create a feedback loop so thatclinicians not only provide samples to a research project, but also gleanactionable information from those samples," she says. "If the lab discoversthat a patient is showing an adverse event during treatment, we need to be ableto feed that information back to the patient through the doctor, so the lattercan act on those findings."
 
 
More decisive decisions
 
"We see what we're doing as democratizing interpretation ofmolecular data," says Brophy. "How can we tee this data up to make it usefulfor clinicians in decision-making? It's not just a matter of putting the datanext to each other, but rather it's about the integration of the data, which iswhy informatics is so important.
 
"As we look at the combination of diagnostic modalities, wehave to think about outputs from the clinician's perspective," he adds. "As theyconsider an individual patient, how would they like to see that patient'sdata?"
 
 
Brophy offers an example from neurology and suggests useful data presentation could be as simple as overlaying an image from a PET scan, which shows brain activity, with an MRI scan, which shows brain tissue atrophy.
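In the simplest case, such an overlay is just an alpha blend of two co-registered, intensity-normalized slices. The numeric sketch below substitutes tiny invented arrays for real scans; actual PET/MRI fusion additionally requires spatial registration and careful intensity scaling.

```python
import numpy as np

# Toy stand-ins for co-registered slices, both normalized to [0, 1]:
# an anatomical MRI slice and a PET activity map (values invented).
mri = np.linspace(0, 1, 16).reshape(4, 4)
pet = np.zeros((4, 4))
pet[1:3, 1:3] = 1.0          # a "hot spot" of metabolic activity

# Simple alpha blend: anatomy shows everywhere, activity where it exists.
alpha = 0.4
overlay = (1 - alpha) * mri + alpha * pet
```

In practice the PET layer is usually rendered in a color map over a grayscale MRI rather than blended channel-for-channel, but the per-pixel arithmetic is the same idea.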
 
 
"We do a lot of work on the individual modalities and thenrely on informatics to combine the data arising from these modalities in auseful form," he says.
 
In a 2011 webinar hosted by the BioIT Alliance, Jeff Pennington of the Children's Hospital of Philadelphia (CHOP) talked about his group's efforts with Thermo Fisher Scientific to introduce and seamlessly integrate new informatics solutions. Pennington, who is director of translational informatics at the CHOP Center for Biomedical Informatics, divides the efforts into four quadrants—biospecimen management, clinical studies, molecular analysis and patient-facing—and tries to find solutions that will allow for smooth data exchange across all of these quadrants.
 
"We're relentlessly focused on making sure these data areaccessible to researchers in as user-friendly a manner as possible," he offers."Our approach is to find a friendly collaborator who has a canonical use-casefor a tool that fits into this four-dimensional framework. We then work withthe collaborator, typically on a grant-funded basis, to pilot a best-of-breedsolution that works for them and has the potential to generalize to otherinvestigators."
 
 
"[CHOP] wanted to support a collaborative work environmentwith centralized administration giving researchers the benefit of a LIMSwithout requiring them to be system administrators," adds Meek, who alsoparticipated in the webinar.
 
 
Through this collaboration, they then figure out the social informatics—the education and organizational change required by the introduction of any new technology—and choose systems that are open and implement web service APIs enabling integration into the larger framework, Pennington says.
 
 
For him, the social informatics side is key to success.
 
"The introduction of a new technology often means thatpeople have to change how they work and people have to learn and adopt newskill sets. The organizational change required for a successful implementationof any of the tools I described is probably the biggest challenge," he says. "Idon't want to underplay the technical work that we do, but really, thatorganizational change is the biggest challenge, and it's something that can'tbe approached lightly. It requires time and a really collaborative approach toworking directly with researchers in a highly iterative manner so that theyfeel that their concerns are being addressed and that they're really involvedin the process."
 
 
Others agree and are making efforts to ensure clinicians are not being run over by the influx of information.
 
 
In October, San Francisco's CollabRx announced a multiyear partnership with Life Technologies Inc. of Carlsbad, Calif., to develop and commercialize its healthcare decision-making analytics and content resources with Life Technologies' global cancer diagnostics and services business. Essentially, the collaboration will add a routinely updated, contextualizing information layer to Life Technologies systems.
 
 
"Molecular analysis, including genetic sequencing, isincreasingly becoming an important part of the clinical management of cancerpatients," said James Karis, CollabRx's co-CEO, in announcing the agreement."However, the sheer volume and complexity of genetic data that is beingproduced, particularly in the course of therapy development, is outpacing theability of practicing physicians to stay current, and more importantly, tounderstand how to apply this genetic data in treating their patients."
  
"It's critical to contextualize the results of complexcancer panels to make them useful for interpreting and treating physicians,"added Life Technologies Life Sciences President Ronnie Andrews. "CollabRx haspioneered the development of a scalable platform and process to provideactionable, accessible and credible knowledge at the point of care to aidphysicians in developing a cancer treatment plan based on tumor molecularprofiles."
 
He who pays …
 
 
The correlation and analysis of real-world patient information with molecular and clinical diagnostics isn't just garnering the attention of pharma and diagnostic companies. When it comes to ensuring the right patient gets the most effective drug with the lowest risks, the insurance industry also has a significant vested interest.
 
 
"Aetna and other payer groups are coming to GNS to go beyondpopulation-level analyses and to characterize, using their available data,where each patient is in the continuum of disease and interventions, wherethose patients are likely to be in the future and which interventions arelikely to positively deflect that trajectory for each patient," offersNeyarapally.
 
In September, GNS and Aetna signed an agreement to leverage the insurer's customer data to help model member risk and identify what interventions make the most sense before more severe health issues arise.
 
It's really a two-way street, Neyarapally explains. Biomarkers of efficacy are of value to payers in helping them understand which patients should get which medicines. Likewise, analysis of payers' claims data can help pharma companies better understand what patient subpopulations are underserved by existing therapies.
 
"We hear recognition in discussions with some of our pharmapartners that the implication for pharma of the emergence of these capabilitiesis that pharma ought to either 'get ahead of the curve' in understanding whatworks for whom at the individual level and allowing that understanding toinform their drug development and product positioning efforts, or that the sameinformation will be obtained and utilized by payers and patients to thepharmas' significant detriment," he says. 
 
Regardless of the stage of personalized medicine development, methods to decode the "information Babel" arising from clinical research and clinical practice are going to be key.
 

Randall C Willis
