The G8: when dementia care got personal (well, molecular actually)

Big Data picture

At best, the free global donation of patients’ DNA via #G8dementia to boost Pharma shareholder dividends can be sold as ‘coproduction’. It’s easy to underestimate, though, the significance of the G8 summit. It was overwhelmingly about the ‘magic bullet’, not the complexities of care. It made great promotional copy for some, though.

It was not, as such, health ministers from the world’s most powerful countries coming together to talk about dementia. It was a targeted strike designed to shrink the democratic deficit that could otherwise open up between Big Pharma and the public.

Here it’s important to remember what #G8dementia was not about. It was not about what a safe level of health and social dementia care around the world might be. It had a specific aim: introducing the need for a global collaboration in big data and personalised medicine. Researchers whose funding depends on the wealth of Big Pharma were also needed to sing from the same hymn sheet.

For such a cultural change in thinking to take effect, a high-profile publicity stunt was needed. Certain politicians and certain charities were clearly big winners. With this, however, it was deemed necessary from somewhere to introduce an element of ‘crisis’ and ‘panic’; hence the terrifying headlines, which served only to add a further layer of stigma to the dementia debate.

And yet it is crucial to remember what was actually discussed in #G8dementia.

In a way, the big data and personalised medicine agenda represents the molecular version of ‘person-centred care’; these are the ‘circles to be squared’ for academics and practitioners, or whatever.

Big data and personalised medicine have been corporate buzz terms for quite some time, but while it’s widely known there are correlations between the two, many are still struggling with how to effectively leverage mass amounts of data in order to improve efficiencies, reduce costs, and advance patient-centric treatments.

Medicine’s new mantra is “the right drug for the right patient at the right time.”  In other words, medical treatments are gradually shifting from a “one size fits all” approach to a more personalized one, so that patients can be matched to the best therapy based on their genetic makeup and other predictive factors.  This enables doctors to avoid prescribing a medication that is unlikely to be effective or that might cause serious side effects in certain patients.

Personalised drug therapy in its most sophisticated form uses biological indicators, or “biomarkers” – such as variants of DNA sequences, the levels of certain enzymes, or the presence or absence of drug receptors – as an indicator of how patients should be treated and to estimate the likelihood that the intervention will be effective or elicit dangerous side effects. In the case of Alzheimer’s disease, the hunt for a marker in the ‘brain fluid’ (cerebrospinal fluid) has been quite unimpressive. The hunt for subtle changes in brain volumes or abnormal protein levels has not fared much better. The information about DNA sequences in Alzheimer’s disease (more correctly a syndrome) is confusing, to say the least. And there are at least 100 different types of dementia apart from Alzheimer’s disease (making the quest for a single cure for dementia even more banal, but a great soundbite for politicians who won’t be in office long anyway).

With healthcare costs in the U.S. having risen steadily over the last 20 years to 17% of GDP, and similar scaremongering about ‘sustainability’ from economically illiterate people on this side of the Atlantic too, moronic healthcare “experts” are looking down every possible path to “efficiency”, “productivity” and “reform”. Many believe that a long-term source of savings could be the use of big data in healthcare; in fact, the McKinsey Global Institute estimates that applying big data strategies to better inform decision making in U.S. healthcare could generate up to $100 billion in value annually.

Significant advancements in personalised medicine, which includes genomics, are making it easier for practitioners to tailor medical treatments and preventive strategies to the characteristics of each patient — advancements that supporters say will improve care and reduce costs. Private markets have long capitalised on fear, and dementia represents a nirvana for private healthcare. It is potentially a huge ‘market’ for drugs. Yet progress is being slowed by a number of factors, including the limited sharing of patient information. This is why there was so much shouting about the need for relaxed regulation at #G8dementia. And yet ultimately, these stakeholders, important though they are, know they can go nowhere without a licence from the public. Patient groups and charities represent ‘farms’ for such projects in medicine, as they do for law firms.

Greater sharing, it is argued, would allow medical institutions that are creating patient databases — some with genomic information — to expand the size of the patient pool, thus making it more likely to identify and treat rare conditions. Such discussions necessarily avoid the contentious issue of who actually owns personal DNA information. What’s more important: the patient’s privacy, or the public interest? Data sharing, it is argued, would also allow patients to personally store and share their data with different practitioners. The day that everyone will have every detail about their personal health on their smartphones isn’t that far off, some hope.

The other component of the data-accessibility issue is how medical researchers should go about building massive databases of patient records. The ultimate application is a big-data program that could analyse a patient’s data against similar patients and generate a course of action for the physician. This is why the #G8dementia participants want to get seriously ‘global’ about this project.

Data can help practitioners diagnose patients more accurately and quickly, and identify risk factors much earlier.

Edward Abrahams, president of the Personalized Medicine Coalition, has said,

“The tricky part is that the public wants control over information, but as patients they may think differently”.

The creation of this value lies in collecting, combining, and analysing clinical data, claims data, and pharmaceutical R&D data to be able to assess and predict the most efficacious treatment for an individual patient.  This might be possible through ‘big data’ and ‘personalised medicine’ in a number of key areas.
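To make the idea concrete, here is a minimal sketch of what ‘combining clinical data, claims data, and R&D data to predict the most efficacious treatment’ might look like in code. Every record format, field name, biomarker and response rate here is an invented assumption for illustration, not any real system’s schema or any real trial result.

```python
# Hypothetical sketch: link clinical, claims and trial records by patient ID,
# then rank candidate treatments for one patient. All names and numbers are
# illustrative assumptions only.

clinical = {"p1": {"diagnosis": "dementia", "apoe4": True}}   # clinical record
claims = {"p1": {"prior_drugs": ["donepezil"]}}               # claims history
trials = {  # drug -> observed response rate by (hypothetical) biomarker group
    "drug_a": {"apoe4_pos": 0.42, "apoe4_neg": 0.18},
    "drug_b": {"apoe4_pos": 0.21, "apoe4_neg": 0.33},
}

def best_candidate(patient_id):
    """Pick the trial drug with the highest response rate for this patient's
    biomarker group, skipping drugs already tried per the claims data."""
    record = clinical[patient_id]
    group = "apoe4_pos" if record["apoe4"] else "apoe4_neg"
    tried = set(claims[patient_id]["prior_drugs"])
    candidates = {d: rates[group] for d, rates in trials.items() if d not in tried}
    return max(candidates, key=candidates.get)

print(best_candidate("p1"))  # -> drug_a (highest APOE4-positive response rate)
```

The point of the sketch is only the shape of the pipeline: three separately held data sources joined on a patient identifier, with a trivial scoring rule standing in for whatever analytics a real system would use.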

Clinical trials are of course necessary for every drug to get to market, and the gold standard is currently a randomised clinical trial backed up by a published paper. Big data approaches are complementary to traditional clinical trials because they provide the ability to analyse population variability and to conduct analytics in real time.

Secondly, the ability to manage, integrate, and link data across R&D stages in pharma might enable comprehensive data search and mining that identify better leads, related applications, and potential safety issues. Sequence data alone are of limited use; they become much more valuable when correlated with phenotypes and other types of data. This has naturally affected the way companies think about data storage and structure, with cloud solutions becoming more popular. Illumina, one of the leaders in next-gen sequencing technologies, now offers cloud solutions for data storage and analysis to meet this growing need. Hence it was ‘name-checked’ in #G8dementia.

Thirdly, once R&D data and clinical trial data are indexed for big data analysis, the third piece of the big data puzzle falls into place: routine clinical practice. Ultimately personalised medicine is about this correlation of diagnostics and outcomes, but tailored to each and every patient.

While big data has already been used successfully in consumer markets, challenges remain to its implementation in healthcare. The primary challenge in moving to big data approaches is simply the vast amount of data in existing systems that currently don’t “talk” to one another and that exist in different file types. Hence there was considerable talk about ‘harmonisation’ of data at the #G8dementia conference. The second challenge for data in the clinical space is how to store and share these large amounts of data while maintaining standards for patient privacy. Achieving better outcomes at lower costs (aka ‘doing more for less’) has become the exhausted strapline for the NHS recently, and big data may seem particularly attractive to NHS England in their thirst for ‘efficiency savings’.
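‘Harmonisation’ of systems that don’t “talk” to one another is less mysterious than it sounds: at its simplest it is a mapping layer that normalises differently shaped exports into one shared schema. A toy sketch, assuming two invented export formats and invented field names:

```python
import csv
import io
import json

# Hypothetical sketch of data 'harmonisation': two systems export the same
# patient facts in different file types with different field names; a small
# mapping layer normalises both into one common schema. All names invented.

CSV_EXPORT = "patient_id,dob,dx_code\n123,1941-03-02,F00\n"
JSON_EXPORT = '[{"id": "456", "birth_date": "1938-11-20", "diagnosis": "F00"}]'

def from_csv(text):
    """Map the CSV system's columns onto the common schema."""
    return [{"patient": row["patient_id"], "born": row["dob"], "code": row["dx_code"]}
            for row in csv.DictReader(io.StringIO(text))]

def from_json(text):
    """Map the JSON system's fields onto the same common schema."""
    return [{"patient": r["id"], "born": r["birth_date"], "code": r["diagnosis"]}
            for r in json.loads(text)]

records = from_csv(CSV_EXPORT) + from_json(JSON_EXPORT)
print(records[0]["patient"], records[1]["patient"])  # -> 123 456
```

The hard part in practice is not the code but agreeing the common schema, and doing the mapping without leaking identifiable data — which is precisely where the privacy challenge above bites.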

However, bridging the “democratic deficit” remains THE fundamental problem. If you thought the #G8dementia was like an international corporate trade fair, you may not have been invited to other similar events.

The 16th European Health Forum brought together 550 delegates from 45 countries, to take the pulse of Europe’s healthcare systems five years after the 2008 financial crisis and consider what needs to be done now to build ‘Resilient and Innovative Health Systems for Europe’. The Big Data workshop was organised by EAPM and sponsored by EFPIA, Pfizer, IBM, Vital Transformation, and the Lithuanian Health Forum.

There, key issues to do with ownership, security and trust had to be addressed, believed Amelia Andersdotter, MEP: “We have some serious challenges for politicians and industry to preserve citizens’ confidence.”

Vivienne Parry in #G8dementia had wanted to talk about ‘safe’ data not ‘open’ data. And where did this idea come from?

Ernst Hafen, Institute of Molecular Systems Biology, ETH Zurich, has said, “We all have the same amount of health data.” Applying big data to personalised medicine, “Only works if we are comfortable with [our data] being used. We have to provide a safe and secure place to store it, like a bank,” also according to Hafen. Tim Kelsey used exactly the same language of banks at a recent event on innovations at London Olympia.

If you think you’ve had enough of PFI, you ain’t seen nothing yet. Public-private partnerships open the way for health data to be shared, and so improve research and translation, according to Barbara Kerstiens, Head of Public Sector Health, DG Research, European Commission. The aim was, “To get stakeholders working together on data-sharing and access, and ensure there is a participant-centred approach,” she said.

And how will the law react? Case law is an important means by which we know what is patentable at the European Patent Office (EPO). However, sometimes the EPO’s view of what is patentable in an area changes before the case law does. This can sometimes be detected when Examiners start raising objections they would not have previously done. Meetings between the EPO and the epi (the professional institute for EPO attorneys) are very useful forums for obtaining ‘inside information’ about the EPO’s thinking which is not yet apparent from the case law. The June 2012 issue of epi Information provides a report of such a meeting held on 10 November 2011 between the EPO and the biotech committee of the epi.

Discussion item 8 was reported as follows:

‘8. Inventions in the area of pharmacogenomics
This concerns cases which are based on a genetic marker to treat a disease, for example methylation profiles. It can involve a new patient group defined by an SNP. The EPO said that often the claims can lack novelty, as one patient will have inevitably been treated with the SNP, even if the art does not explicitly say so.’

The EPO’s comments seem to indicate that it is about to change the way it assesses novelty when looking at medical use claims that refer to treatment of a specific patient group.

A “SNP” is a form of genetic marker which varies between individuals. The idea behind the relatively new field of pharmacogenomics is that, if you know which SNP variants a patient possesses, you can personalise the drugs given to a patient in accordance with his genetic makeup. It is now recognised that the genetic makeup of an individual can be very influential as to whether he responds to a drug, and so one application of pharmacogenomics is to only give those drugs to patients who will respond to them.
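In code terms, this application of pharmacogenomics reduces to a lookup: a patient’s genotype at an informative marker predicts responder status. The sketch below is purely illustrative — the SNP identifier, genotypes and responder calls are all invented assumptions, and real prescribing decisions rest on curated clinical evidence, not a dictionary.

```python
# Illustrative pharmacogenomics lookup. The SNP ID, genotypes and responder
# calls are hypothetical; this only shows the shape of the idea.

RESPONSE_TABLE = {
    # (snp_id, genotype) -> is the patient expected to respond to the drug?
    ("rs0000001", "AA"): True,
    ("rs0000001", "AG"): True,
    ("rs0000001", "GG"): False,  # hypothetical non-responder variant
}

def likely_responder(snp_id, genotype):
    """Return True/False where the marker is informative, None if unknown."""
    return RESPONSE_TABLE.get((snp_id, genotype))

print(likely_responder("rs0000001", "GG"))  # -> False
```

Note the third outcome: for most patients and most drugs the marker is simply uninformative, which is one reason suitable biomarkers are proving so hard to find.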

Presently, suitable biomarkers for personalised medicine are proving difficult to find. So it seems that the sector is going to require a lot of investment. That’s where #G8dementia came in handy. But investors in biotech do like to see that strong patent protection is available in the relevant sector; hence the upbeat, rosy approach from the speaker from JP Morgan at #G8dementia, who framed the debate in terms of risks and returns.

Personalised medicines, and in fact diagnostics in general, has been thrown into uncertainty in the US after the Supreme Court’s decision in Mayo v Prometheus which found that a claim referring to steps that determined the level of a drug in a patient was directed to a law of nature and was thus not patentable. It would be unfortunate for personalised medicines to be dealt a further blow by the EPO, making the test for novelty stricter in this area.

So there may be trouble ahead.

The #G8dementia was merely the big players, with the help of this Pharma-friendly community in the UK, dipping their toe in the water. It was really nothing to do with frontline health and social care, and any mention of them was really to make the business case look relevant to society at large.

For academics it was interesting in that it was the moment person-centred care came ‘up front and personal’. Molecular, really.