A “therapeutic imperative” is a treatment necessity that must be met in order to prevent death. Once the imperative is clearly defined, addressing it properly links correct diagnosis with effective therapy and a favorable prognosis. Conversely, misdiagnosis or inadequate diagnosis fails to provide a framework for highly effective or curative therapy.
Historically, the real therapeutic imperatives of cancer management have been scarcely known, leaving only limited benefits from treatment in the face of certain demise. But technology brings benefits.
Today, the modern multiomic analysis of cancer removes the molecular blindfold with payoffs for diagnosis and disease management akin to the development of the microscope and the arrival of high-definition imaging. By extension, practicing without molecular diagnosis will soon be unimaginable, comparable to diagnosing cancer without a microscope or three-dimensional radiological imaging.
Genomics transformed some disease outcomes
Since the discovery of the structure of DNA in 1953, the completion of the first human genome sequence in 2003, and the successful engineering and commercialization of large-panel next-generation DNA sequencing over the last 20 years, many therapeutic imperatives in oncology have been addressed or are in the process of being addressed.
For example, almost no one now dies of oncogene-driven cancers that were once universally fatal, such as chronic myeloid leukemia or acute promyelocytic leukemia. Lethal diseases like mantle cell lymphoma (MCL), metastatic melanoma, and non-small cell lung cancer (NSCLC) now show something once unheard of: significant and improving five-year survival rates.
Meeting the therapeutic imperative of MCL with BTK inhibitors, or of NSCLC driven by an EGFR exon 19 deletion with an EGFR inhibitor, causes these diseases to utterly collapse in most cases.
Such examples show how molecular diagnosis provided sufficient diagnostic clarity to identify and meet the therapeutic imperatives in each instance.
Yet most solid cancers are not simple oncogene-driven diseases but complex network diseases fueled by genomic entropy, a non-conservative deterioration of the human genome that haphazardly generates daughter cells with new resources.
Ultimately, these resources undermine treatment and drive a survival-of-the-fittest phenomenon. As a result, most metastatic complex solid tumor malignancies remain fatal diseases. Their therapeutic imperatives go undiagnosed, leaving a relentless malignant phenotype to generate resistance against overly simplistic or, for many patients, irrelevant therapeutic approaches.
Such are the consequences of a complex adaptive network that is continually evolving resistance (CANCER). For these complex cancers, superficial molecular diagnosis is inadequate diagnosis. And “comprehensive molecular diagnosis” is something altogether different from the 300- to 600-gene panels whose evolution is frozen in time by regulatory approval, the anti-innovative and regressive consequence of payor approval.
At the same time, limiting panel size to what oncologists can reasonably manage may no longer be necessary in an age of computer-assisted diagnostic tools and procedures.
The complexity of most cancers is reflected in the acquisition of many mutations and/or a plethora of chromosomal copy-number changes that reduce the allelic count of tumor suppressor genes, which protect against cancer, or increase the allelic count of oncogenes, which facilitate dysregulated growth.
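To make the allelic arithmetic concrete, here is a minimal Python sketch; the gene lists, thresholds, and the flag_dosage_events helper are illustrative assumptions, not a clinical pipeline.

```python
# Purely illustrative subsets of credentialed cancer genes
TUMOR_SUPPRESSORS = {"TP53", "RB1", "PTEN", "CDKN2A"}
ONCOGENES = {"MYC", "EGFR", "KRAS", "CCND1"}

def flag_dosage_events(copy_numbers):
    """Classify genes by allelic dosage relative to the normal diploid count of 2."""
    events = {"suppressor_loss": [], "oncogene_gain": []}
    for gene, cn in copy_numbers.items():
        if gene in TUMOR_SUPPRESSORS and cn < 2:  # one or both alleles lost
            events["suppressor_loss"].append(gene)
        elif gene in ONCOGENES and cn > 2:        # extra copies acquired
            events["oncogene_gain"].append(gene)
    return events

print(flag_dosage_events({"TP53": 0, "PTEN": 1, "MYC": 6, "KRAS": 2}))
# {'suppressor_loss': ['TP53', 'PTEN'], 'oncogene_gain': ['MYC']}
```

Even this toy classification hints at the problem: the output is a list of events, not an account of what the network does with them.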
Though we have the capacity to measure most genomic aberrations, there is an unbridged chasm between the enumeration of dozens, hundreds, or even thousands of changes in an individual patient’s cancer and an understanding of the functional organization of the malignant network and the biological resources it provides.
The chasm between mere measurement and diagnostic insight is analogous to the gap between the discovery of optics and the technology for creating lenses, on the one hand, and the invention of the microscope, which harnessed that revolution in optics into practice, on the other.
How exactly can we understand the functional organization of all the genomic aberrations in a cancer and translate that into meaningful therapy?
Computational biosimulation uncovers cancer networks
Enter computational biological modeling, or “biosimulation.” Biosimulation traces molecular species through their cellular fate, encompassing transcription, translation, cellular location, and post-translational modifications. Additionally, biosimulation provides in silico modeling of canonical and non-canonical protein-protein interactions to represent signaling pathway cascades, the biological functions they mediate, and their convergence on the key transcription factors that determine cell fate, i.e., the master regulators.
In this manner, biosimulation creates a molecular digital double that reveals the consequences of gene expression and protein function that shape the cell’s capacity to achieve homeostasis, the hallmark behaviors of cancer, and robust adaptation under stress, including therapeutic stress.
In short, biosimulation represents a new technology that can delineate the cancer network and its key nodes, identifying its mechanistic dependencies, therapeutic vulnerabilities, and mechanisms of resistance.
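As a toy illustration of this network view, the sketch below propagates activation through a directed signaling graph; the wiring and gene names are entirely hypothetical, not a real biosimulation model, but they show how two genomically distinct lesions can converge on the same master regulator.

```python
from collections import deque

# Hypothetical wiring: protein -> proteins it activates (not a real pathway model)
NETWORK = {
    "EGFR": ["RAS"],
    "RAS":  ["RAF", "PI3K"],
    "RAF":  ["MEK"],
    "MEK":  ["ERK"],
    "PI3K": ["AKT"],
    "ERK":  ["MYC"],         # MYC stands in for a master-regulator transcription factor
    "AKT":  ["MYC", "FOXO"],
}

def downstream(start):
    """Return every node reachable from `start` by breadth-first traversal."""
    seen, queue = set(), deque([start])
    while queue:
        for nxt in NETWORK.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Two different upstream lesions share one convergence point (a key node)
print(downstream("RAF") & downstream("PI3K"))  # {'MYC'}
```

The intersection of downstream effects, not the list of upstream mutations, is what identifies the node whose targeting matters.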
Not surprisingly, there is variation in the network determinants of patients who ostensibly have the same disease. This variation drives differential responses to individual drugs and therapeutic strategies. But the network variation among cancers is not as profound as the genomic variability behind the N-of-1 conundrum: if everyone’s cancer is different, then hypothesis testing borders on the impossible.
By comparison, diagnosing the underlying cancer network amounts to identifying its key nodes, characterized by oncogenes, master regulators, and synthetic lethal vulnerabilities, which commonly overlap among patients with divergent genomics.
From the vantage point afforded by biosimulation, the unlimited number of potential genomic aberrations and combinations filters down to a tractable level of variation which supports hypothesis testing and clinical research.
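A hedged sketch of that filtering (the aberration-to-node map and patient data below are invented for illustration): collapsing divergent per-patient aberrations onto shared key nodes yields tractable cohorts for hypothesis testing.

```python
from collections import defaultdict

# Hypothetical map from individual aberrations to the key node they converge on
ABERRATION_TO_KEY_NODE = {
    "EGFR amp": "MYC", "KRAS G12D": "MYC", "PIK3CA E545K": "MYC",
    "RB1 loss": "E2F1", "CCND1 amp": "E2F1",
}

patients = {"pt1": ["EGFR amp"], "pt2": ["KRAS G12D"], "pt3": ["CCND1 amp"]}

cohorts = defaultdict(list)  # key node -> patients sharing that dependency
for pt, aberrations in patients.items():
    for ab in aberrations:
        node = ABERRATION_TO_KEY_NODE.get(ab)
        if node:
            cohorts[node].append(pt)

print(dict(cohorts))  # {'MYC': ['pt1', 'pt2'], 'E2F1': ['pt3']}
```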
Most remarkably, diagnosing the cancer network’s key nodes removes the molecular blindfold, thus permitting the design of combination therapies to target cancer’s key dependencies and vulnerabilities.
At a basic level, biosimulation-informed treatment allows physicians to avoid treatments that have no chance of efficacy. More profoundly, the technology identifies combination therapy that can overcome the limitations of under-targeting a complex network prone to adapt to simplistic approaches.
Unlike the alphabet soup that has characterized many combination therapy approaches of the past, which are based on combining drugs with different mechanisms of action and non-overlapping toxicities, biosimulation affords the possibility of identifying and exploiting the relevant Achilles’ heels in an individual’s cancer.
Recognizing the therapeutic imperatives in a given network makes it possible to surpass the naïve one-drug-at-a-time approach or molecularly blind, often irrelevant combinations.
Biosimulation is fundamentally patient-centric and presents a bespoke method of tailoring therapy to the uniqueness of an individual’s cancer network. This contrasts with protocol-centric approaches that derive evidence from population-based methods.
The weakness of evidence-based medicine is that its generalized conclusions may be irrelevant for individuals alleged to have the same diagnosis, but whose disease has a molecular basis that does not fit the treatment.
When it comes to complex and highly resourced cancers, the assumption of sameness underlying clinical trial participation turns out, in every subset analysis, to be weak, and often wrong, for a majority of participants.
This makes for the familiar grim retrospective of most cancer therapies that were inaugurated with terrific hope and terrible naivete but which afford only marginal improvements. By comparison, molecular diagnosis and biosimulation embrace heterogeneity in the patient population and seek to identify the imperatives and opportunities to defeat the disease upfront.
Limitations of cancer modeling
Of course, all models have limitations. One concerns the inputs to the model. Small gene panels that are commercially popular and have regulatory approval address only a small fraction of the 800 oncogenes and 1,200 tumor suppressor genes that have credentials in cancer biology, and these are but a fraction of the 22,000 genes that, in one way or another, are subsumed in the malignant process.
Though it is possible to identify the expression level of every gene relative to a normal cell, such complete annotation is not reported, leaving the transcriptome almost completely underutilized in the clinic: measured, yet unknown to the treating physician.
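As a sketch of what reporting that annotation could look like (the reference values and the log2_fold_changes helper are hypothetical), each gene’s tumor expression can be summarized as a log2 fold change against a matched normal reference.

```python
import math

NORMAL_TPM = {"MYC": 20.0, "PTEN": 50.0, "CCND1": 15.0}  # illustrative normal reference
TUMOR_TPM  = {"MYC": 160.0, "PTEN": 3.0, "CCND1": 16.0}  # illustrative tumor sample

def log2_fold_changes(tumor, normal, eps=0.5):
    """log2((tumor + eps) / (normal + eps)); eps guards against zero expression."""
    return {g: round(math.log2((tumor[g] + eps) / (normal[g] + eps)), 2)
            for g in tumor if g in normal}

print(log2_fold_changes(TUMOR_TPM, NORMAL_TPM))
# {'MYC': 2.97, 'PTEN': -3.85, 'CCND1': 0.09}  (over-, under-, and normally expressed)
```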
Ultimately, whole proteome and phosphoproteome measurements could replace the inferences made from genomic and transcriptomic analyses. Undoubtedly, the multiomics of the future will improve inputs and with it the capability of biosimulation to identify therapeutic imperatives for each cancer patient.
Another limitation of biosimulation is the perpetual incompleteness of biology, especially its complex regulatory processes, i.e., the regulome, which encompasses the majority of intronic genetic material, including promoter polymorphisms (averaging five per gene), thousands of microRNAs and long non-coding RNAs, as well as LINE elements and remnants of ancient retroviruses, the latter acquired over millions of years and making up some 8% of the human genome.
Nevertheless, the known world of cancer biology now represents an immense body of knowledge to inform our understanding of the functional organization of cancer. As such, biosimulation, even at its inception, illuminates a formidable and challenging disease network and removes the molecular blindfold from treatment design.
Even today’s perpetually incomplete model has utility that far surpasses the era of “throwing it at the wall and seeing if it sticks,” i.e., the trial-and-error method that characterizes the therapeutic approaches of the past and our contemporary guidelines.
Meeting the therapeutic imperative in oncology
Meeting the therapeutic imperative in oncology can be understood as the marriage of comprehensive multiomic sequencing, biosimulation, and combination drug treatment. This marriage rekindles the possibility of curing cancer by translating an engineering-level understanding of disease mechanisms and vulnerabilities into a network-level diagnosis that permits implementation of meaningful network-targeting combination therapy.
By analogy to aerospace engineering, air travel provides a virtual guarantee of arriving at the destination. Similarly, biosimulation brings an engineering-level focus to why conventional therapies fail and what must be done to defeat the disease.
With it, we inaugurate a new mantra: sequence, biosimulate, treat. Eventually, practicing oncology without biosimulation will be the molecular equivalent of practicing surgery with a blindfold in place.
Recalling Arthur C. Clarke’s comment that “any sufficiently advanced technology is indistinguishable from magic,” applied biosimulation can induce the collapse of cancers that historically have been untreatable, marginally treatable, or treated by trial and error. It improves the value of society’s investment by avoiding therapies that have no chance of working, and it extends therapies that are usually effective for only a brief time by targeting the mediators of resistance. Inevitably, biosimulation will help the drug development industry cut the cost of drug development and improve profitability, which could eventually reduce the cost of drugs.
Selecting, across the oncologic spectrum, all the patients for whom a particular drug or treatment strategy is relevant, and excluding those for whom there is no possibility of benefit, is common sense. But it is the kind of sense that remains outside the scope of traditional clinical trial design and regulatory approvals, which are still frozen by the cognitive biases of the pre-genomic era.
By contrast, the breakthrough in understanding the functional organization of complex cancers brings us a win-win proposition, promising to enhance survival and simultaneously improve economic efficiency of healthcare spending for society and industry.
Yet, biosimulation, like all medical innovations, now faces the burden of proof. However, patient-centric trials seldom receive the funding they need, especially for combinatorial therapies that involve agents from different pharmaceutical companies.
Additionally, skepticism toward computer-assisted diagnosis is tantamount to questioning the value of sight over blindness. We cross our fingers and hope practice guidelines will perform well for a particular patient, uncertain whether they will. Until the evidence base matures, biosimulation-assisted decision-making can bring the latest insights of science to bear on crucial decisions in the clinic.
The absence of clinical evidence does not mean that there is no scientific basis; often the underlying molecular biology has enjoyed solid preclinical consensus for many years. But clinicians still excuse themselves from mastering the mechanistic intricacies of the diseases they treat, a standard of care countenanced by guidelines committees and regulators and enforced by insurance companies that, intentionally or unwittingly, seem to prefer the unsatisfactory status quo.
Many scientific insights remain unevaluated simply because they do not also promise a return on investment. Nevertheless, however disruptive or problematically inexpensive a new technology may be, the status quo is too limited to persist; stakeholders must overcome the inertia of change and bring alignment to their diverse incentives.
Eventually, personalized network targeting for cancer therapy will be as obvious as controlling blood pressure, blood sugar, lipids, and platelet stickiness in a patient with coronary disease, all essential ingredients of disease management that, incidentally, employ drug combinations never evaluated in phase I trials.
As in cardiology, biosimulation identifies the molecular network nodes that must be targeted before the cancer kills the patient, defining not merely a new way to treat cancer but a new therapeutic imperative.
Despite the sophistication of modern oncology, the practice of administering cellular poisons under the assumption of sameness was a radical undertaking at a time when the mechanisms of disease in an individual’s cancer were unknowable and would remain so for decades to come.
With the arrival of multiomics and the tools to make sense of the complexity of the individual’s cancer network, biosimulation-informed therapy selection represents a far more conservative, look-before-you-leap approach to disease management.
Biosimulation builds on the oncogene targeting approach by developing a strategy to treat tumor suppressor gene loss through synthetic lethality, while the network view of the master regulators illuminates cancer’s functional organization. With these, we glean a new perspective on why therapies work or fail for specific patients, while also creating a means to achieve collapse of the disease network.
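A hedged sketch of that synthetic-lethality step (the lookup table below is a tiny illustrative subset, and only the BRCA/PARP pairing is an established clinical example): map each lost tumor suppressor to a candidate synthetic-lethal drug target.

```python
# Illustrative synthetic-lethal pairs: loss of the (undruggable) suppressor
# makes its (druggable) partner a candidate target.
SYNTHETIC_LETHAL = {
    "BRCA1": "PARP",       # established clinically (PARP inhibitors)
    "BRCA2": "PARP",
    "SMARCA4": "SMARCA2",  # reported in preclinical studies
}

def propose_targets(lost_suppressors):
    """Map each lost tumor suppressor to a synthetic-lethal target, where one is known."""
    return {g: SYNTHETIC_LETHAL[g] for g in lost_suppressors if g in SYNTHETIC_LETHAL}

print(propose_targets(["BRCA1", "TP53"]))  # {'BRCA1': 'PARP'}; no entry yet for TP53
```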
For would-be early adopters and their patients, biosimulation aims to provide a parachute of clinical insight and a soft landing. The alternative, jumping without a parachute or flying the plane without instruments, has a predictable and still too often unsatisfactory outcome.
For all parties and especially the decision-makers who invest in clinical trials, skepticism about something new and unfamiliar could also be balanced by embracing a breakthrough technology that promises to transform our understanding of cancer and form the basis for the next generation of cancer therapies.