April 3, 2018
Organizers of the ISC High Performance conference announced that physicist and CTO of CERN openlab, Maria Girone, will be speaking at the 2018 conference as the main conference keynote speaker on Monday, June 25. The conference runs from June 24 to June 28 in Frankfurt, Germany. Organizers expect over 3,500 HPC community members at the Messe Frankfurt for the 2018 conference, which targets researchers, scientists, business leaders and academicians.
In her keynote talk, Girone will focus on the demands of capturing, storing and processing the large volumes of data generated by the Large Hadron Collider (LHC) experiments.
According to the ISC, the LHC is a powerful particle accelerator and a large, complicated machine. It collides protons 40 million times every second at each of four interaction points, where the four particle detectors are housed. This high collision rate makes it possible to identify rare phenomena and helps physicists reach the level of statistical certainty required to declare new discoveries, such as the Higgs boson in 2012. Extracting a signal from this huge background of collisions is one of the most significant challenges facing the high-energy physics (HEP) community.
The HEP community has been a driver in processing massive scientific datasets and in managing the largest-scale high-throughput computing centers, according to ISC. Together with industry leaders across a range of technologies, including processing, storage and networking, HEP researchers have developed a scientific computing grid: a collaboration of more than 170 computing centers in 42 countries, spread across five continents. Today, the Worldwide LHC Computing Grid regularly operates 750,000 processor cores and nearly half an exabyte of disk storage.
CERN openlab is a public-private partnership between the European Organization for Nuclear Research (CERN) and global information and communications technology companies. It plays a key role in helping CERN address the computing and storage challenges of the LHC upgrade program.
Computing and storage demands will become even more pressing when CERN launches the next-generation “High-Luminosity” LHC in 2026. At that point, the total computing capacity required by the experiments is projected to be 50 to 100 times greater than today, with storage needs expected to be on the order of exabytes.
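The scale of that jump can be illustrated with a rough back-of-envelope calculation using the figures quoted above: roughly 750,000 processor cores today and a projected 50- to 100-fold increase in required computing capacity. This sketch is illustrative only; the actual requirements depend on technology evolution and on how much of the demand the R&D program described below can offset.

```python
# Back-of-envelope projection of High-Luminosity LHC computing needs,
# using only the figures cited in the article. Illustrative only.

CURRENT_CORES = 750_000             # WLCG processor cores today (per article)
GROWTH_LOW, GROWTH_HIGH = 50, 100   # projected capacity growth factor (per article)

low_cores = CURRENT_CORES * GROWTH_LOW
high_cores = CURRENT_CORES * GROWTH_HIGH

print(f"Projected capacity needed: {low_cores:,} to {high_cores:,} core-equivalents")
print(f"That is {low_cores / 1e6:.1f}M to {high_cores / 1e6:.1f}M cores, "
      f"versus {CURRENT_CORES / 1e6:.2f}M today")
```

At a constant budget, naively scaling today's grid by 50 to 100 times is clearly out of reach, which is what motivates the alternative approaches Girone describes.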
Even assuming the expected improvements in IT technologies, and given the realities of a constant budget, the current approach to data processing will not be sustainable. This is why an intense R&D program is under way to explore alternative approaches to the High-Luminosity LHC big data problem.
“I will discuss some of the approaches we are considering to grapple with these enormous data requirements, including deploying resources through using commercial clouds, and employing new techniques, such as alternative computing architectures, advanced data analytics and deep learning,” explains Girone. “Finally, I will present some medical applications resulting from the research at CERN.”
One area of medicine that can make use of CERN’s technologies and expertise is hadron therapy, a rapidly developing technique for tumor treatment. The next step in radiation therapy is the use of carbon and other ions. By 2020, around 100 centers around the world are expected to offer hadron therapy, with at least 30 of them located in Europe.
Sources: Press materials received from the company.
About the Author
DE’s editors contribute news and new product announcements to Digital Engineering. Press releases can be sent to them via [email protected].