Examples of Engineering in the Cloud

Engineering teams share their success with CAE on the cloud via the UberCloud Experiment.

By Wolfgang Gentzsch and Burak Yenier

The UberCloud Experiment started in July 2012 to explore cloud computing for engineers and scientists, and to learn about cloud hurdles and how to remove them. Since then, the UberCloud Experiment has attracted thousands of organizations and 195 active teams in computational fluid dynamics (CFD), finite element analysis (FEA) and biology. Each year, numerous case studies are published in a Compendium to show how teams overcame roadblocks to engineering on the cloud.

Below are summaries of three of the 19 case studies in the 2016 UberCloud Compendium. This Compendium, the case studies and the corresponding engineering cloud experiments have been sponsored by Digital Engineering, Hewlett Packard Enterprise and Intel.

Team 169: Complex Blood Flow through Cardiovascular Medical Device Using OpenFOAM on Advania

This team consisted of the end user Praveen Bhat, a CAE consultant in India, and partners from HPE France, CFD Support in Prague, and Advania in Iceland. During this one-week proof of concept, the team set up a technical computing environment on Advania Cloud Platinum instances. OpenFOAM, the popular open-source CFD toolkit, was used to simulate complex blood flow through a cardiovascular medical device. Pre-processing, simulation runs and post-processing were all performed successfully using an OpenFOAM container that provides a fully equipped, powerful virtual desktop in the cloud, containing all the necessary software, data and tools.

Figure 1: Team 169 – Streamlines representing the path of the blood flow in the artery and at the inlet of the medical device, visualized with ParaView running inside an UberCloud container on the Advania cloud.

On a Platinum 3X Large instance, the simpleFoam solver ran on 16 cores in parallel for 1,000 time steps of the cardiovascular device simulation in 30,000 seconds (roughly eight hours); this parallel workflow is sketched after the list below. The total effort (excluding the eight-hour simulation run time) was six person-hours to access Advania's OpenCloud, get familiar with the environment, set up the OpenFOAM container, conduct testing, develop the medical application geometry and boundary conditions, start the jobs and do the post-processing with ParaView. It also included contacting and talking to Advania Support:

  • 2 hours setting up a test account, getting familiar with GUI, requesting increase in quotas
  • 1 hour setting up container environment, getting the base container, doing a quick test
  • 3 hours setting up medical device simulation, including steps like meshing, running the simulation (5 times), monitoring, opening tickets with support, etc.
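For a concrete picture of the run itself, here is a minimal sketch of the kind of parallel OpenFOAM workflow described above, driven from Python: decompose the case across 16 cores, run simpleFoam under MPI, then reconstruct the fields for post-processing in ParaView. The case directory name and flags are illustrative assumptions, not the team's actual configuration.

```python
# Minimal sketch of a 16-core parallel OpenFOAM run of the kind Team 169
# describes. Assumes OpenFOAM and MPI are installed and the case's
# decomposeParDict sets numberOfSubdomains to 16.
import subprocess

CASE_DIR = "cardio_device_case"   # hypothetical OpenFOAM case directory
N_CORES = 16                      # matches the 16-core run reported above

def run(cmd):
    """Run an OpenFOAM utility in the case directory, failing loudly."""
    subprocess.run(cmd, cwd=CASE_DIR, check=True)

# Split the mesh and fields into 16 subdomains.
run(["decomposePar", "-force"])

# Launch the steady-state solver in parallel under MPI.
run(["mpirun", "-np", str(N_CORES), "simpleFoam", "-parallel"])

# Merge the per-processor results back into a single dataset so ParaView
# can load the case directly.
run(["reconstructPar"])
```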

Team 175: Parametric Radio Frequency Heating with COMSOL Multiphysics in the Cloud

This cloud experiment was performed by the COMSOL team in Göttingen, Germany, to evaluate the performance of a COMSOL Multiphysics container in the CPU 24/7 cloud. For this project, the parametric radio frequency (RF) heating model from the COMSOL Multiphysics application library was used. It shows dielectric heating of an insulated block, caused by microwaves traveling in an H-bend waveguide. The solver was parallelized so that several frequencies and geometric parameters could be computed at the same time. This makes the model an embarrassingly parallel computation, well suited to scaling in the cloud.
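As a generic illustration of why such a sweep is embarrassingly parallel (this is not COMSOL's API; solve_rf_heating is a hypothetical stand-in for one batch solver run, and the parameter values are made up), each frequency/geometry pair is an independent job that can simply be farmed out to workers:

```python
# Generic embarrassingly parallel parametric sweep: every
# (frequency, geometry) pair is independent, so the pairs can be
# distributed across worker processes with no communication between them.
from concurrent.futures import ProcessPoolExecutor
from itertools import product

frequencies = [f * 1e9 for f in (4.0, 4.2, 4.4, 4.6)]   # Hz, illustrative
bend_radii = [0.02, 0.025, 0.03]                         # m, illustrative

def solve_rf_heating(params):
    freq, radius = params
    # Placeholder for one solver invocation; a real setup would launch
    # the simulation in batch mode for this parameter pair.
    return (freq, radius, 0.0)  # e.g. peak temperature in the block

if __name__ == "__main__":
    jobs = list(product(frequencies, bend_radii))
    with ProcessPoolExecutor() as pool:   # one worker per core by default
        results = list(pool.map(solve_rf_heating, jobs))
    # Collect and evaluate the results, mirroring the workflow above.
```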

Figure 2: Team 175 – Simulated H-bend waveguide. The left image shows the magnetic field (arrows) in the waveguide and the temperature in the dielectric block (yellow); the right image shows the speedup gained when adding more compute nodes in the cloud.

Although the model is quite small and could be computed on almost any modern computer, the number of frequencies to be computed, combined with the different geometric parameters, meant that a large number of simulations had to be performed. Even if computing a single parameter takes only 30 minutes on a mid-sized workstation, the total computation time becomes unacceptably long as the number of parametric values grows. The benchmarks were performed on one through 10 compute nodes (each equipped with dual Intel Xeon X5670 processors and 24GB of RAM). After the simulations finished, the computed results were collected and evaluated.
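The arithmetic behind that concern is simple. The sketch below estimates total wall time under ideal, embarrassingly parallel scaling; the 100-point sweep size is an illustrative assumption, while the 30-minute per-parameter cost comes from the text above.

```python
import math

def sweep_wall_time_hours(n_params, n_nodes, minutes_per_param=30.0):
    """Ideal-scaling wall time: jobs run in waves of n_nodes at a time."""
    waves = math.ceil(n_params / n_nodes)
    return waves * minutes_per_param / 60.0

# e.g. a 100-point sweep at 30 minutes per point:
for nodes in (1, 5, 10):
    print(f"{nodes:2d} node(s): {sweep_wall_time_hours(100, nodes):.1f} h")
# Prints 50.0 h on 1 node, 10.0 h on 5 nodes, 5.0 h on 10 nodes.
```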

Team 187: CFD Analysis of a V6 Intake Manifold Using STAR-CCM+ in the Cloud

The end user of this cloud experiment was Krishnan Nayanar, a project manager at CAE Technology Inc., supported by Siemens PLM Software, which provided STAR-CCM+ Power-on-Demand (PoD) tokens, and Microsoft Azure as the cloud provider. The case study concerns the intake manifold of an automotive engine, a critical component: designing it for low pressure drop directly affects engine efficiency. In a V6 manifold, it is also important to achieve a uniform flow rate across each branch of the intake duct. The CFD flow analysis in this case study provides a thorough understanding of these parameters and helps in designing efficient and economical manifolds.

The computations were performed on Azure A9 instances in a 10-node "medium" class cluster, in which eight compute nodes were each equipped with dual-socket Intel Xeon E5-2670 processors running at 2.6GHz and 112GB of RAM, for a total of 128 cores and 896GB of RAM. The nodes were connected by a 40Gbit/s InfiniBand network with remote direct memory access (RDMA) technology. The software used for the simulation was STAR-CCM+ V10.04. The simulation was run for 10,000 iterations, by which point all the monitors had stabilized.

Figure 3: Team 187 – CFD analysis of automotive V6 intake manifold using a software container with STAR-CCM+ in the Azure Cloud. Left: manifold geometry. Right: velocity streamlines from CFD simulation.

Total time taken on the 128-core Azure cluster to mesh 3.47 million cells with five boundary layers was 17.7 minutes, and total time to run 10,000 iterations was five hours and 20 minutes.
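Those figures imply a straightforward throughput estimate; the short calculation below derives the average time per iteration from the reported totals (the derived numbers are ours, not from the team's report).

```python
# Average solver throughput implied by the reported run time.
iterations = 10_000
run_seconds = 5 * 3600 + 20 * 60          # five hours and 20 minutes
sec_per_iter = run_seconds / iterations   # ~1.92 s per iteration
iters_per_min = 60 / sec_per_iter         # ~31 iterations per minute
print(f"{sec_per_iter:.2f} s/iteration, {iters_per_min:.1f} iterations/min")
```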

Wolfgang Gentzsch is president and co-founder of UberCloud. Burak Yenier is CEO and co-founder of UberCloud. For more information, visit ubercloud.com.
