Formed in 2007, Black Swan Analysis is devoted exclusively to delivering high-quality data analysis, forecast modelling and market research within the healthcare sector, generating insightful outputs that make business decisions clear for clients. Black Swan Analysis's understanding of patient epidemiology and pharmaceutical forecasting, together with its wealth of client-side experience, has been a true differentiator and an asset to both the company and its clients, delivering a level of confidence that ultimately translates into quality output in every aspect of its service.
We spoke to Managing Director Christopher Ehinger about developments in oncology, challenges to PET Imaging and new advances in the treatment of debilitating diseases and disorders.
Q.1 What have been the biggest advancements in oncology over the past ten years? How have these impacted cancer treatment and what further advancements can we expect to see?
Treatments & Survival:
Oncology has witnessed something of a revolution over the past 15 years – starting with imatinib in the early 2000s for the treatment of chronic myeloid leukaemia (CML) – and the pace of progress has really accelerated in the last few years. There have been numerous new drug launches over the past 18 months, such as the PD-1 inhibitors (pembrolizumab from MSD and nivolumab from BMS), numerous targeted tyrosine kinase inhibitors (ibrutinib from Pharmacyclics and nilotinib from Novartis, to name but two) and a whole host of other targeted treatments, including VEGF-, EGFR- and BRAF-targeted therapeutics.
With ASCO only weeks behind us, some astounding clinical data was presented that suggests this revolution is set to continue. There is currently much excitement over CAR T-cell therapy, whereby the patient's own immune system is directed to attack their tumours. And while some combinations have been hailed as able to 'melt tumours', there is still the issue of how all of these great things are going to reach the patient (i.e. the not insignificant issue of funding).
New, more effective treatments with fewer side-effects are obviously a good thing for patients. Patients are surviving much longer; cancer is turning from an acute disease into a more chronic condition. This is starting to pose a serious problem for healthcare systems in terms of funding patients on increasingly expensive treatment options over longer time periods.
In the short term, cancer is set to move from acute to chronic with significant increases in patient survival. However, longer term, we might see cancer reverting to an acute condition in terms of duration of ‘treatment’.
Better disease definition and understanding:
As knowledge of risk factors, genetic markers and epigenetics deepens, we are understanding more and more about cancer and how and why it develops. For example, cessation of cigarette smoking in many countries (removing one of the single largest contributory environmental factors for lung cancer) has led to the gradual emergence of a new phenotype of lung cancer – a non-smoking related form of the cancer which is more predominant in females. Removal of known causative agents for particular cancers often reveals much about other sub-types of cancers hidden under the noise of the more prevalent types.
Once an understanding of cause has been established, it is possible to address that cause or to remove it entirely.
Prevention rather than Cure:
But it doesn’t stop at improvements in treatment; some cancers are being prevented altogether. When the cause has been identified, it may be possible to prevent the cancer from ever arising in the first place. For example we should start to see a radical decrease in rates of cervical cancer over the next 10-15 years as a direct result of HPV vaccination, removing susceptibility to strains of HPV responsible for ~ 70% of cases of cervical cancer. With better identification of risk factors and genetic markers, more prevention-led activities are bound to follow.
Which would then lead cancer back to being an acute condition in terms of treatment rather than chronic – cancer could eventually follow the form of a vaccine market rather than a traditional treatment market.
The importance of patho-epidemiology:
Since the market is highly dynamic and becoming more targeted, a solid understanding of the patient populations, and the subtypes within each, is more important than ever.
Imagine the emergence of a new biomarker that identifies a group of patients that would benefit from a new drug. Knowing the proportion of patients within the overall disease population that express that marker would aid greatly when forecasting product sales and volumes.
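The arithmetic behind this kind of patient-based forecast can be sketched as a simple eligibility funnel. All figures and parameter names below are hypothetical, purely for illustration:

```python
# Illustrative patient-based forecast funnel; not Black Swan's actual model.
def eligible_patients(disease_population, biomarker_rate, diagnosis_rate, treatment_rate):
    """Funnel a prevalent disease population down to drug-treatable patients."""
    return disease_population * biomarker_rate * diagnosis_rate * treatment_rate

# Hypothetical example: 100,000 prevalent patients, 30% express the marker,
# 80% of those are diagnosed and tested, 60% of those receive the drug.
patients = eligible_patients(100_000, 0.30, 0.80, 0.60)
print(round(patients))  # 14400
```

Each rate in the chain is exactly the kind of quantity that a patho-epidemiological review aims to pin down; get the biomarker proportion wrong and every downstream volume and revenue estimate inherits the error.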
Similarly with regard to histopathology – is a drug particularly effective for tumours arising from a certain cell type?
In the case of cancer prevention, understanding which groups could disappear from the mix could be advantageous when developing new drug targets, pricing new therapies or running clinical studies.
Traditional epidemiology can take us so far, but to go the extra mile requires an understanding of the pathology of the disease as well as its incidence to provide a full picture to fit the needs of an evolving market.
Q2. How will epidemiologic investigations develop during the next five years? Do you expect to see the integration of new technology (VR, Google Glass etc.) into this process? What benefit does this technology offer?
More exciting in terms of epidemiology is the emergence of so-called "telehealth" / remote patient monitoring and apps / devices that track patient biometrics over extended periods of time. Such data could provide valuable insights into the natural history and progression of a disease, plus more insight into factors affecting drug efficacy, such as compliance, and into the effect of medication in preventing disease progression or treating symptoms, or even the monitoring of adverse events. Data of this type could be highly beneficial when estimating the cost-effectiveness of new drugs (one criticism levelled at past HTA submissions is the limited predictive ability of the available data used to estimate the true impact of a new medication).
Additionally, closer monitoring of patients both pre- and post-development of a disease, with subsequent data mining of the information, could generate new hypotheses for disease risk factors, first symptoms etc., which could in turn lead to better, quicker diagnosis or even prevention of the disease if risk factors can be properly identified.
This leads us to the inevitable term of “big data”. So far, while everyone has hailed ‘big data’ as a ‘good thing’, not much has come from it. There is now an unprecedented amount of information available to us. On a daily basis we are bombarded with information, but not all of it is useful or accurate. Is this the case with ‘big data’?
Many lack the skills to filter such large volumes of information to find the truly valuable and meaningful pieces (needles and haystacks spring to mind). It is more important than ever to be able to find and extract the information that you need rather than the hundreds of pieces of misinformation that are readily available.
Recently, while undertaking a piece of epidemiological research into a rare 'disease' (it's not really a disease per se, but a condition that can arise as the result of many other diseases), I came across several pieces of seemingly disparate, contradictory information regarding the frequency of this condition within one reasonably homogeneous population. Closer inspection of the populations reported with this condition revealed that in one study the sample was males with high-grade disease – not that relevant to the overall population required, especially since this condition occurs more frequently in high-grade disease. However, this was the most highly referenced study within this population group, with many papers reporting this study's frequency as that of the overall population.
With so much more information available, while more will be able to be done in terms of identifying patients, disease trends and causation, it will become harder and a more skilled job to identify and apply the appropriate information in the appropriate way.
When we undertake an epidemiological review, we find as much relevant information as possible and seek to understand where and why papers differ in their reported rates or other information. This is why our bibliographies are quite so lengthy – we have tried to turn over every stone to find the truth about the disease we are quantifying.
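Once studies with mismatched populations (like the high-grade-only sample above) have been screened out, the remaining estimates can be combined. A minimal sketch, weighting by sample size and using invented study figures:

```python
# Sketch of pooling frequency estimates across studies, weighted by sample
# size. The study numbers are invented for illustration only.
def pooled_rate(studies):
    """studies: list of (cases, sample_size) tuples from published papers."""
    total_cases = sum(cases for cases, _ in studies)
    total_n = sum(n for _, n in studies)
    return total_cases / total_n

# Two hypothetical general-population studies; a third, drawn only from
# high-grade disease, would be excluded before pooling.
general_population_studies = [(12, 400), (30, 1100)]
print(round(pooled_rate(general_population_studies), 4))  # 0.028
```

More sophisticated meta-analytic weightings exist, but even this simple pooling makes clear why including a non-representative study can badly skew the headline rate.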
Q3. What current challenges does Positron Emission Tomography (PET) Imaging face? How are these challenges being dealt with? What diagnostic opportunities does PET Imaging offer?
Christopher: The growth of the PET diagnostic imaging modality over the past 20 years has been based primarily on its use in oncology, utilising the 18F-FDG radioisotope to deliver patients a differential diagnosis and tumour staging information for various types of cancer. Any future growth in this modality will be limited predominantly by the capacity of the PET/CT scanner infrastructure, which is currently close to full. Advances in the CT scanner hardware component that provide quicker scan times will improve workflow and the number of scans that can potentially be completed in a day, providing some incremental growth in scan numbers.
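The capacity constraint described here lends itself to a back-of-envelope model. All parameters below are assumptions chosen for illustration, not actual scanner-fleet data:

```python
# Back-of-envelope PET/CT capacity model; every input is a hypothetical
# assumption, not real infrastructure data.
def annual_scan_capacity(scanners, hours_per_day, minutes_per_scan, operating_days):
    """Maximum scans a fleet can deliver per year at full utilisation."""
    scans_per_day = (hours_per_day * 60) // minutes_per_scan
    return scanners * scans_per_day * operating_days

# Hypothetical network: 50 scanners, 10-hour operating day,
# a 25-minute slot per scan, 250 operating days a year.
print(annual_scan_capacity(50, 10, 25, 250))  # 300000
```

Shaving even a few minutes off the per-scan slot raises the ceiling across the whole fleet, which is why hardware-driven workflow gains translate directly into incremental scan growth.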
The growing utility of PET/MRI technology, developed by the major equipment providers, could enable the use of diagnostic tracers at a molecular level not possible to visualise with current PET/CT scanner technology. This could deliver a higher rate of accuracy at a field of view 10 times smaller than that currently available on the market. These attributes will enable the commercial potential of new F-labelled tracers for neurological imaging, such as in Alzheimer's disease, and oncology markers that closely track molecular changes during treatment, delivering better outcomes for patients.
Several new therapeutics require a positive diagnostic test involving a PET scan (e.g. Alzheimer’s with companion PET imaging agents to show beta amyloid plaque burden) before treatment can be initiated or reimbursed. Companies bringing products such as these to market will need to understand the constraints that could potentially limit uptake of their product as a direct result of resource limitations of the current PET scanner infrastructure and capacity.
Q4. What debilitating diseases and disorders do you expect to be manageable in the near future? What will need to be done to achieve this?
Christopher: The most debilitating diseases are those for which there is no effective treatment and outcomes for patients are poor. Many of these come under the category of rare or orphan diseases. According to the NIH (National Institutes of Health), there are over 7,000 rare diseases, affecting close to 10% of the world's population.
While this is a huge area of unmet need, thankfully some of the higher-profile rare diseases are attracting attention, with the largest areas of research focusing on cancers and genetic disorders (including inherited conditions). For patients with rare diseases and short life expectancy (e.g. Wegener's granulomatosis, pulmonary arterial hypertension), we can expect an extension to life expectancy (as per the revolution we are seeing in oncology). In the case of Wegener's granulomatosis, treatment with rituximab has led to a doubling or even a three-fold increase in patient life expectancy, according to data from the Norfolk Arthritis Registry.
According to the FDA, one third of all new drug approvals in the last five years were for drugs to treat rare diseases, and recent years have seen a few ground-breaking drugs reach the market. Some of the rare diseases being most actively pursued are:
- Prader-Willi syndrome
- Duchenne muscular dystrophy
- Gaucher’s disease
- Von Willebrand disease
- Fabry disease
- Epidermolysis bullosa
- Niemann-Pick disease, type C
- Amyotrophic lateral sclerosis (Lou Gehrig’s disease)
- Huntington’s disease
All of these conditions are highly debilitating, with significant loss of quality of life, lack of independence and shortened life expectancy.
Of course, rare diseases are attractive areas of research for several reasons: comparator therapies are lacking, the opportunity for demonstrating patient benefit is high, and rare or orphan disease targets can be given fast-track status by regulatory bodies such as the FDA and EMA. The potential to show significant outcomes for patients also plays well in health technology assessments, demonstrating overall value and utility.
This speaks mostly to therapeutic developments. However, there is also increasing investment in stem cell therapies. This area, while not on the immediate horizon, could be revolutionary for treating patients with conditions such as spinal cord injury, traumatic brain injury and other conditions where the body's inability to repair itself has led to loss of function.
Understanding the populations affected by these conditions poses a number of challenges in and of itself. Data regarding population prevalence is scarce or absent in the majority of cases, making valuations of assets to treat these conditions nearly impossible. And even when information can be found, it sometimes does not make sense or is contradictory. However, estimates can be modelled from the available data coupled with information regarding the natural history and progression of the disease. Huntington's disease is a good example: there are reported birth prevalence rates for the genetic marker that leads to the development of HD, but symptoms can develop at any point in the patient's life, with earlier emergence of symptoms leading to significantly shorter life expectancy.
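One way to sketch this kind of modelling is a steady-state approximation (prevalence ≈ incidence × mean duration), converting a carrier birth prevalence into a symptomatic prevalence. Every input below is a hypothetical assumption for illustration, not a published Huntington's figure:

```python
# Steady-state sketch: symptomatic prevalence ~ annual onset x mean duration.
# All inputs are hypothetical assumptions, not published HD epidemiology.
def symptomatic_prevalence(annual_births, carrier_birth_rate, penetrance,
                           mean_years_symptomatic):
    # New symptomatic cases per year, assuming a stable population and that
    # penetrant carriers manifest at some point during life.
    annual_onset = annual_births * carrier_birth_rate * penetrance
    # Prevalent symptomatic cases at any one time.
    return annual_onset * mean_years_symptomatic

# Hypothetical: 700,000 births/year, 1-in-10,000 carrier birth prevalence,
# 95% penetrance, ~15 years from symptom onset to death.
print(round(symptomatic_prevalence(700_000, 1 / 10_000, 0.95, 15)))
```

The same skeleton shows why variable age of onset matters: earlier onset with shorter survival changes the mean symptomatic duration, and with it the prevalence estimate, even when birth prevalence of the marker is fixed.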
Q.5 Outside of those you already cover, what other segments within the Healthcare Industry do you find interesting? Have you ever considered covering these segments?
Christopher: An area of much interest, adjacent to our disease epidemiology databases, would be the assessment of the cost impact of healthcare treatments and innovation on the healthcare system. These assessments would make up the bulk of the market information currently used in a Health Technology Assessment (HTA).
Healthcare costs have become more of a challenge in recent years with the approval of novel treatments addressing identified clinical unmet needs, in some cases for a small sub-segment of the population, at treatment costs that many healthcare systems struggle to justify paying. Many of these healthcare systems, such as the NHS, require evidence that the new innovation delivers a positive cost impact versus the current standard of care, which can be a challenge for expensive treatments addressing only a small patient population, especially in rare and orphan diseases.
We expect this to be an area of growth across all the major global markets, requiring a continuous source of patient workflow costs, patient epidemiology and drug pricing data to establish an accurate standard of care against which these new technologies can be compared. This could potentially be done by disease area, split by a catalogue of current procedures and treatment algorithms. All the information would be sourced so that the customer would have a validated starting point for comparing their product with those currently used in a specific market.
As we can see from Christopher's answers, new techniques and therapies in oncology have resulted in more effective treatments with fewer side-effects, an obvious win-win for patients. Telehealth technology will enable greater insights into the history and progression of diseases, and pharma companies have been focusing heavily on the treatment of rare diseases. All in all, it looks like the healthcare sector is constantly evolving, resulting in better treatments, therapies and medicines for all.
We'd like to thank Christopher for taking the time to answer our questions, and for providing us with such comprehensive responses.
About the Analyst:
Christopher Ehinger has over 18 years of pharmaceutical experience in various commercial roles in therapeutic, diagnostic and medical device businesses. This includes experience in blue-chip organizations such as SmithKline Beecham, GlaxoSmithKline, Amersham Health and GE Healthcare where he was the Global Marketing Director for the Oncology and Neurology disease franchises.
He is currently the Managing Director of Black Swan Analysis, a unique, fully integrated analysis practice established in 2007. The company provides forecast modelling services and develops novel patient-related healthcare databases, which are made available via the web to a global client base of pharmaceutical and healthcare professionals on a subscription basis.
Christopher’s hospital and specialist care market experience includes launching 5 new chemical entity (NCE) products and 3 new license indications for previously approved prescription products for the USA and EU markets. Chris brings a wealth of experience in due diligence & NPV valuations for in-license opportunities in Neurology & Oncology therapy areas to the Black Swan Analysis Team. Chris holds an MBA from London Business School.
Stay up-to-date with the latest trending news stories and industry advances with the Research and Markets blog. Don’t forget to join our mailing list to receive alerts for the latest blog plus information about new products.