
Are You Ready for the Petabyte Revolution?

By Steve Robbins

 


Sometimes it seems like I spend all day on the phone. This makes sense because DE is a nexus for companies that develop engineering products and services and the engineers who want to learn about new ideas and products that will help them in their jobs.

Amid all this information and noise, every once in a while something clicks and an aha moment occurs: an epiphany, really.

I was recently on a call set up by ANSYS, listening to Professor Piero Dellino of the University of Bari, Italy, explain how he had used ANSYS FLUENT to simulate volcanic eruptions and the particle clouds they create. Through experimentation he has defined what volcanic ash is, how it moves, and the particle size and concentration of an ash eruption. He said that most volcanic material is glass, and that some of it is very fine and irregularly shaped, which lets it remain suspended in the atmosphere for a long time. ANSYS software can simulate the gas and particulate matter from an eruption, as well as the distribution of the ash cloud, using discrete phase modeling, resulting, as you might have guessed, in a very large data set.
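Discrete phase modeling treats the gas as a continuum and tracks individual particles through it. For readers curious what that means in practice, here is a minimal sketch in Python of a Lagrangian particle tracker in that spirit. To be clear, this is not ANSYS FLUENT's solver: the plume velocity field, particle size range, and material constants below are all assumptions chosen for illustration.

```python
import numpy as np

# Toy Lagrangian ash-particle tracker in the spirit of discrete phase modeling.
# NOT ANSYS FLUENT's method; every numeric value here is an illustrative assumption.

MU_GAS = 1.8e-5                   # gas dynamic viscosity, Pa*s (assumed)
RHO_ASH = 2500.0                  # density of glassy ash particles, kg/m^3 (assumed)
G = np.array([0.0, 0.0, -9.81])   # gravity, m/s^2

def gas_velocity(pos):
    """Prescribed plume field (an assumption): a 5 m/s crosswind plus an
    updraft that decays with altitude."""
    updraft = 50.0 * np.exp(-pos[:, 2] / 5000.0)
    return np.stack([np.full(len(pos), 5.0), np.zeros(len(pos)), updraft], axis=1)

rng = np.random.default_rng(0)
n = 1000
pos = np.zeros((n, 3))                          # all particles start at the vent
vel = np.zeros((n, 3))
d = rng.uniform(10e-6, 100e-6, size=n)          # fine-ash diameters, m (assumed range)
tau = (RHO_ASH * d**2 / (18.0 * MU_GAS))[:, None]  # Stokes response time per particle, s

dt = 0.01
for _ in range(10_000):                         # 100 s of simulated time
    # Stokes drag relaxes each particle toward the local gas velocity plus its
    # settling velocity; the exponential update stays stable even when dt >> tau.
    target = gas_velocity(pos) + tau * G
    vel = target + (vel - target) * np.exp(-dt / tau)
    pos += vel * dt

print(f"mean altitude: {pos[:, 2].mean():.0f} m, spread: {pos[:, 2].std():.0f} m")
```

Even this toy version makes the data-volume point: logging position and velocity for millions of particles at every time step adds up very quickly.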

On to the next call: Microsoft’s Bill Hilf, general manager of Platform Strategy, briefed members of our editorial staff on Microsoft’s Technical Computing initiative. I had just reviewed “Modeling the World” on Microsoft’s website.

Microsoft is leading this initiative to empower engineers and scientists with the tools to create large-scale computer models that simulate complex systems. Changes in computer technologies will drive this revolution. Our conversation focused on technical computing in the cloud, simplifying parallel development, and the tools and applications these changes will enable. In the near future, engineers could work together to build common platforms that scale to the resources needed, including very large-scale multicore clusters that are not available today. The power to tackle the biggest problems, analyze data without preconceived models, and find the patterns that lead to solutions would become available to researchers in companies and academia, large or small. As Bill explained Microsoft’s vision for the initiative, the proverbial light bulb went off.

Scientists and engineers are creating massive amounts of data, some intentionally and some as a byproduct of experiments, tests, and sensors. Data is being recorded from almost every aircraft in service today, from satellites, from environmental monitors, and from sources we don’t even think about. Machines are creating data just for other machines. If scientists, engineers, or even financial analysts had access to very large amounts of data and the computing power to let statistical algorithms find patterns in it, we could change the way we model our ideas about how the world behaves.

I wondered if Professor Dellino could better predict the behavior of volcanic ash clouds if he incorporated real-time weather data. And if the data from his model of eruption particle behavior were public, it isn’t a stretch to imagine aerospace engineers around the world using it to design aircraft engines that withstand the effects of the glass particles in volcanic ash, or to predict what concentrations of ash an engine can safely fly through.

Can engineers and scientists make data public in today’s world, with its competitive pressures and intellectual property concerns? How does the technical community address the issues of sharing data? Would your client or company allow test results or analysis data to be made public? Are you ready for the petabyte revolution? Let me know what you think.

 


Steve Robbins is the CEO of Level 5 Communications and executive editor of DE. Send comments about this subject to [email protected].
