
Randy Bean: Big Data In The Time Of Coronavirus (COVID-19)

COVID-19 has arrived with consequences that are grave and unsettling. We are seeing daily reports showing outbreak and fatality curves on a worldwide and country/city basis. Big Data lies at the heart of efforts to comprehend and forecast the impact that Coronavirus will have on all of us.

To better understand how Big Data is being employed to forecast and understand the reach and impact of Coronavirus, I spoke with Sy Pretorius, MD, who serves as Executive Vice President and Chief Medical & Scientific Officer for Parexel, a global provider of biopharmaceutical services that help transform scientific discoveries into new treatments. Parexel is one of the largest clinical research organizations in the world.

I spoke with Dr. Pretorius about how Big Data can help our understanding of COVID-19:

How are Big Data sets of massive amounts of epidemiological and scientific data enabling health workers, scientists, epidemiologists, and policymakers to make more informed decisions in fighting coronavirus?

The near real-time COVID-19 trackers that continuously pull data from sources around the world are helping healthcare workers, scientists, epidemiologists and policymakers aggregate and synthesize incident data on a global basis.
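For illustration, here is a minimal sketch of the kind of aggregation such trackers perform, assuming the publicly available Johns Hopkins CSSE time-series CSV layout; the URL, column name, and function are illustrative additions and not part of the interview:

```python
# Minimal sketch of a tracker-style aggregation: pull a public incident-data
# feed and roll it up by country. Assumes the JHU CSSE time-series CSV layout
# (one row per province/country, one column per report date); any similarly
# structured feed would work the same way.
import pandas as pd

CSV_URL = (
    "https://raw.githubusercontent.com/CSSEGISandData/COVID-19/master/"
    "csse_covid_19_data/csse_covid_19_time_series/"
    "time_series_covid19_confirmed_global.csv"
)

def latest_counts_by_country(url: str = CSV_URL) -> pd.Series:
    """Return the most recent cumulative confirmed-case count per country."""
    df = pd.read_csv(url)
    latest_column = df.columns[-1]          # right-most column is the newest date
    return (
        df.groupby("Country/Region")[latest_column]
          .sum()                            # collapse provinces into countries
          .sort_values(ascending=False)
    )

if __name__ == "__main__":
    print(latest_counts_by_country().head(10))
```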

There has been some interesting data resulting from GPS analyses of population movement by region, city, etc., which ultimately helps provide a view of the population’s compliance — or lack of compliance — with social-distancing mandates.
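To make the mechanics concrete, below is a hedged sketch of how such a compliance signal can be derived from raw location pings. The ping schema (device_id, region, timestamp, lat, lon), the distance-based metric, and the function names are illustrative assumptions, not a description of any vendor's actual methodology:

```python
# Hedged sketch of a social-distancing signal from GPS pings: total daily
# distance traveled per region, compared against a pre-mandate baseline.
# The schema and metric here are assumptions for illustration only.
import numpy as np
import pandas as pd

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * np.arcsin(np.sqrt(a))

def daily_distance_km(pings: pd.DataFrame) -> pd.DataFrame:
    """Sum the distance between consecutive pings for each device and day,
    then total it by region and date."""
    pings = pings.sort_values(["device_id", "timestamp"]).copy()
    pings["date"] = pd.to_datetime(pings["timestamp"]).dt.date
    grouped = pings.groupby(["device_id", "date"])
    pings["km"] = haversine_km(
        pings["lat"], pings["lon"],
        grouped["lat"].shift(), grouped["lon"].shift(),   # previous ping in the same device-day
    )
    return pings.groupby(["region", "date"], as_index=False)["km"].sum()

def mobility_change_pct(daily_km: pd.DataFrame, baseline_end: str) -> pd.Series:
    """Percent change in average daily kilometers per region after baseline_end."""
    cutoff = pd.to_datetime(baseline_end).date()
    baseline = daily_km[daily_km["date"] <= cutoff].groupby("region")["km"].mean()
    after = daily_km[daily_km["date"] > cutoff].groupby("region")["km"].mean()
    return ((after - baseline) / baseline * 100).round(1)
```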

Additionally, at least one digital thermometer maker has been tracking temperatures recorded by its devices, showing that average temperatures have declined in some locations following the enactment of social-distancing guidance.

How have advances in Big Data management and data preparation made it possible to source, aggregate, and synthesize incident data on a global basis?

As a society and as an industry, we have many opportunities to make the use of Big Data more impactful in situations like these, though we have not yet been able to effectively leverage its power in the search for a cure.

Ideas such as creating large-scale COVID-19 Real World Evidence (RWE) studies that pull data from a variety of real-world sources, including patients now being treated in the hospital setting, could help accelerate the development of treatments in a more patient-centric and patient-friendly way.

We are just starting to see movement among advanced data aggregation service companies and virtual studies platforms in serving the life sciences sector. The most common aim is to connect assay results to clinical status in near real time.

Assuming that data-curation delays can be meaningfully reduced, these solutions could create analyzable patient data sets that could be interrogated with both traditional approaches and AI-based pattern detection to better understand the disease. Population-level views allow epidemiologists, biostatisticians and clinicians to explore the relative effectiveness of variations in local treatment protocols.

The COVID-19 pandemic could be viewed as a call to action to determine how access to data could be improved. We at Parexel are actively and urgently working to form a coalition with other drug development experts and industry associations to support such an effort here in the United States, but we need to move quickly.

How is Big Data making it possible to fight global pandemics on a more data-driven and preemptive basis?

While we are seeing greater advancements with Big Data, as both a society and an industry we still have steps to take to effectively leverage its power in the search for a cure for COVID-19. Advanced analytics and signal detection within healthcare systems are among several large automation improvements that will help surface early signs of a pandemic. Greater integration within a global system also needs to continue.

Models built on Big Data supported the prediction of a pandemic. Our ability to contain and respond with treatment depends on early detection and on leveraging as much data as can be aggregated on infected individuals. This will help us understand differences in presentation and response to the various treatment modalities, and codify the treatments that are having the greatest impact.

How is Big Data contributing to the efficiency and effectiveness of clinical trials?

Clinical trial operations are benefiting most from the ability of data to connect us more readily with patients during this time. The result is faster and more targeted site and patient recruitment, while ensuring patient access to these important studies.

We at Parexel are exploring ways to create large-scale COVID-19 RWE studies that can pull data from what we are observing with patients currently under treatment. Some of the potential benefits include a more patient-centric and patient-friendly focus on identifying and developing potential cures.

The key benefit is an opportunity for faster and near real-time evaluation of decision-making based on the data. This will save lives and result in identifying effective therapies faster.

Another idea is to identify potential patients at the point of testing, then rapidly populate patient histories and track progress by linking to various real-world sources via tokenization, with limited burden on the patient and the site.
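As a rough illustration of the tokenization idea, the sketch below derives a deterministic, keyed token from basic identifiers so that records from different sources can be joined without sharing the identifiers themselves. The field names, normalization rule, and keyed-hash scheme are assumptions for illustration, not Parexel's actual approach:

```python
# Minimal sketch of tokenization for linking patient records across
# real-world data sources without exposing direct identifiers. Production
# systems use managed tokenization services with strict key governance;
# this keyed-hash scheme is illustrative only.
import hashlib
import hmac
import unicodedata

def normalize(value: str) -> str:
    """Lower-case, trim, and strip accents so the same person hashes identically."""
    value = unicodedata.normalize("NFKD", value.strip().lower())
    return "".join(ch for ch in value if not unicodedata.combining(ch))

def patient_token(first: str, last: str, dob: str, secret_key: bytes) -> str:
    """Deterministic, keyed token: the same identifiers always yield the same
    token, so records from different sources can be joined on it, while the
    identifiers cannot be recovered without the key."""
    material = "|".join(normalize(v) for v in (first, last, dob))
    return hmac.new(secret_key, material.encode("utf-8"), hashlib.sha256).hexdigest()

# Example: the same patient seen at a testing site and later in a hospital
# produces the same token, so the two records can be linked.
key = b"example-key-held-by-the-tokenization-service"   # illustrative only
print(patient_token("Ada", "Lovelace", "1990-01-01", key))
```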

Once healthcare data can be used directly (e.g., synthetic control arms) to build the clinical data set for safety and efficacy analysis, fewer patients will need to be enrolled and randomized, reducing the total duration of the study and the logistical burden on patients.
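A hedged sketch of what a synthetic control arm can look like in code: each treated patient is matched to a similar patient drawn from historical healthcare data, and outcomes are then compared across the two groups. The covariates, the greedy 1:1 nearest-neighbor matching, and the outcome measure are illustrative assumptions, not a validated trial design:

```python
# Hedged sketch of a synthetic (external) control arm: instead of randomizing
# patients to a control group, each treated patient is matched to the most
# similar remaining patient in a historical cohort, and outcome rates are
# compared between the treated arm and that matched cohort.
import pandas as pd

def build_synthetic_control(treated: pd.DataFrame,
                            historical: pd.DataFrame,
                            covariates=("age", "severity")) -> pd.DataFrame:
    """Greedy 1:1 matching on standardized covariates, without replacement."""
    cols = list(covariates)
    pool = historical.copy()
    scale = historical[cols].std()
    matches = []
    for _, patient in treated.iterrows():
        dist = ((pool[cols] - patient[cols]) / scale) ** 2
        best = dist.sum(axis=1).idxmin()        # closest remaining historical patient
        matches.append(pool.loc[best])
        pool = pool.drop(index=best)            # match without replacement
    return pd.DataFrame(matches)

def treatment_effect(treated: pd.DataFrame, control: pd.DataFrame,
                     outcome: str = "recovered") -> float:
    """Difference in outcome rates between the treated arm and its synthetic control."""
    return treated[outcome].mean() - control[outcome].mean()
```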

What lessons can be learned from the ability to readily source and organize Big Data sets on a global basis for other global health care challenges?

There are massive efforts being deployed to leverage AI and machine learning on the datasets that are currently available. We anticipate that the value of the currently available datasets will increase significantly as COVID-19 testing becomes more readily available. Progressing ‘Clinical Research as Care’ from theory to practice will be key.

How are public health and epidemiology benefiting from Big Data capabilities relative to data challenges of the past? Are we getting data faster, better, more accurately?

We are getting better data, faster and more accurately. Numerous data sets are being aggregated that provide us with both broader and deeper views of population health, healthcare utilization, and mobility and lifestyle patterns. This information is critical as we assess the potential incidence and behavior of a condition.

How can Big Data enable the world to plan and prepare for further epidemiological outbreaks?

The data sets from the COVID-19 pandemic will likely form part of the evidence package that will be presented to regulatory authorities once a therapy or therapies have been identified that appear to be effective. This will potentially set a precedent for how data can be used in similar situations in the future.

What we learn from approaches such as synthetic controls, used in the absence of randomized control populations, will help mitigate challenges in future epidemiological outbreaks.

(Forbes)
