Looking to the Future of Comprehensive Precision Medicine
Precision medicine has gone from a focus primarily on genomics to a larger lens across a broader spectrum of data about us as humans.
November 9, 2018
Spirometers, in their various forms, have been used since 1846 to measure patients’ lung capacity. Early versions were taller than many patients and worked by measuring the displacement of water. Today, digital spirometers fit in the palm of your hand and link wirelessly to an app on a patient’s smartphone, providing physicians with continuous, real-time measurements.
Following lung surgery, a patient leaving the hospital might be asked to blow into the device every hour or two, enabling physicians to immediately identify problems before they become serious health threats. Now, imagine thousands of patients using such a device, and physicians linking the measurements to their genomic profiles, then aggregating the information to discover patterns. In the future, anesthesiologists might use this information to decide ahead of time which drugs would be best for each individual surgical patient.
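The monitoring workflow described above can be sketched in a few lines. This is a hypothetical illustration only: the function name, the 20% threshold, and the sample readings are assumptions for the sake of the example, not part of any TGen or City of Hope system.

```python
# Hypothetical sketch: flag a post-surgery patient whose hourly spirometer
# readings (FEV1, in liters) fall sharply below their personal baseline.
# Threshold and data are illustrative assumptions.

def flag_decline(readings, baseline, drop_fraction=0.2):
    """Return indices of readings more than drop_fraction below baseline."""
    threshold = baseline * (1 - drop_fraction)
    return [i for i, r in enumerate(readings) if r < threshold]

hourly_fev1 = [2.9, 2.8, 2.9, 2.2, 2.1]   # liters; the last two suggest trouble
alerts = flag_decline(hourly_fev1, baseline=2.9)
print(alerts)  # → [3, 4]
```

In practice such a rule would run server-side over thousands of patients’ streams, which is where the aggregation and pattern discovery described above come in.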
TGen’s Work with City of Hope
TGen is working with the Innovation Lab at City of Hope, our southern California affiliate, on just such a project. Spurred at first by the needs of patients with cystic fibrosis, the wirelessly linked portable spirometer is expected to move eventually into the common care stream for patients.
This is just one example of how precision medicine is moving beyond DNA sequencing and into a world of quantified medicine, where tools like AI and machine learning are required to cope with the growth of biomedical digital data.
Data from our phones, Fitbits and other devices are being used more and more in clinical settings. Elite athletes, for example, employ biomechanics to improve techniques and performance to gain even the tiniest advantage that could mean the difference between victory and defeat.
We are moving toward the quantification of ourselves: the ability to measure our bodily functions. In healthcare, this could mean saving time and money, and gaining access to more immediate health benefits.
Computing Needs of the Future
As we merge these different data sources to create a more complete picture of health, it becomes necessary to have vast computational, storage, and network resources to process it all.
At TGen, we are continuously upgrading our high-performance computing (HPC) capabilities to anticipate our researchers’ supercomputing needs for data collection, aggregation, analysis, networking, management and storage.
We once boasted two of the world’s TOP500 supercomputers. Of course, we are now dwarfed by some of today’s supercomputers represented at the SC18 conference.
All of the data TGen generated in its first few years — 1 terabyte, or 1 trillion bytes — can now fit easily on a single USB hard drive. The advent of next-generation sequencing in 2006 dramatically increased the volume of data being generated, and it isn’t slowing down. Computation and storage capabilities must keep pace: they have to be faster, cheaper, and better.
TGen’s Partnership with Dell EMC
Today, in collaboration with Dell EMC, the storage capacity of the entire TGen system is more than 3 petabytes (3,000 terabytes), with over 3,000 processing cores, making it one of the most powerful supercomputers purpose-built for processing genomic information.
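As a rough illustration of that scale, the figures above can be checked with simple arithmetic. The roughly 100 GB per whole genome used below is an illustrative assumption, not a figure from TGen.

```python
# Back-of-the-envelope check of the storage figures above, using decimal
# units (1 TB = 10**12 bytes, 1 PB = 10**15 bytes).
GB = 10**9
TB = 10**12
PB = 10**15

total_bytes = 3 * PB
print(total_bytes // TB)           # → 3000 terabytes, matching the text

genome_size = 100 * GB             # assumed raw footprint per whole genome
print(total_bytes // genome_size)  # → 30000 genomes, order of magnitude only
```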
For TGen’s HPC infrastructure, we leverage one of the industry’s state-of-the-art scale-out storage systems — the Dell EMC Isilon platform — to store, manage and protect data with efficiency and massive scalability. This system saves time. Our researchers no longer have to shepherd data among different computing systems. Data from our genomic sequencers now goes directly to our HPC center.
More than anything, this is a boon for our patients, many of whom are suffering from aggressive cancers. They can’t afford to wait for new precision medicine-based treatments.
On the horizon is exascale and beyond. I believe that with the increasing amount of data being collected and stored about each and every one of us, we will need ever-increasing levels of computational capability. This will be essential as we move toward a greater understanding of our entire selves: from a starting point of disease and the search for treatments and cures, to a starting point of health, maintaining and enriching the health of each individual.
— James Lowey is Chief Information Officer, TGen.org