Moore’s Law for Finite Elements

This theory sets the standard for the ever-increasing element counts of FEA models.

Tony Abbey

Most people will be familiar with Moore’s Law, which wasn’t so much a law as a 1965 prediction by Intel’s Gordon Moore that the number of transistors on a microprocessor chip would roughly double every two years. Currently, the density of components on a silicon wafer is close to reaching its physical limit. But don’t worry: there are promising technologies that should supersede transistors and overcome this. Demand from consumers will ensure that this evolution continues.

In the world of finite element analysis (FEA), the pursuit of Moore’s Law has meant that we can build more complex models thanks to ever-increasing, affordable computing power. Added to this is the dramatic increase in software program efficiency over the same period.

Does Moore’s Law Apply to FEA Models?

I was researching Moore’s Law for a training class and I wondered if it might match increasing FEA model size over the years. My first serious use of FEA in industry was in 1976. The preprocessing consisted of sketching a mesh overlay directly onto the drawing board. A typical aircraft lug model took 300 brick elements, created by sketching a 2D mesh and then extruding it through the depth.

The next waypoint is 1985. I have some fragile plots of a satellite bracket meshed with 2000 3D elements. An old report shows a location fixture from 1993 with around 10,000 elements. An archived model of a similar fitting from 2007 uses 400,000 elements. This supported local plastic behavior and fracture mechanics investigations.

I fitted this and other old model data to a Moore’s Law-type curve. The growth was a little below the doubling factor of 2.0; the element count instead multiplied by around 1.6 every two years. However, that is still a dramatic increase, from a humble 300 elements to the modern 400,000-element model.
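
As a rough illustration of that curve fit, here is a minimal sketch in Python. It assumes the approximate element counts quoted above (300, 2,000, 10,000 and 400,000), which are round figures from my own archive rather than exact measured data.

# Fit the element-count waypoints to exponential (Moore's Law-style) growth.
# The counts are the approximate figures quoted in the article.
import math

models = [(1976, 300), (1985, 2000), (1993, 10000), (2007, 400000)]

# Least-squares fit of log(count) against year gives the annual growth rate;
# doubling that exponent gives the growth factor per two years.
years = [y for y, _ in models]
logs = [math.log(n) for _, n in models]
count = len(models)
mean_y = sum(years) / count
mean_l = sum(logs) / count
slope = sum((y - mean_y) * (l - mean_l) for y, l in zip(years, logs)) \
        / sum((y - mean_y) ** 2 for y in years)

factor_per_two_years = math.exp(2 * slope)
print(f"Growth factor every two years: {factor_per_two_years:.2f}")   # about 1.6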

Baselining Simple Components

I selected structural components of similar scope in both geometry and FEA technology to create my Moore’s Law for elements. Many of us build similar basic models day-to-day. I ignored more complex structures, assemblies and analysis. I wanted to see if Moore’s Law could predict my basic bracket’s future.

The result was startling for this simple structure. To keep up with Moore’s Law, I should now be running at around 3 million elements, and by 2020 this should be reaching 7.5 million elements. So an interesting question now arises: What is the sensible number of elements to use? The 2007 model had effectively topped out; it was perfectly adequate for any kind of structural analysis that I would want to apply. Conversely, the 1976 model at 300 elements was inadequate and required a lot of interpretation to try to predict load paths and accurate stresses.
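
To see where projections like these come from, here is a quick extrapolation sketch using the same 1976 baseline of 300 elements and the rounded 1.6 factor, and assuming the article’s “now” is around 2015. The exact figures depend on the fitted rate, so the rounded factor puts the 2020 number somewhat above the 7.5 million quoted above.

# Element count if growth continued at `factor` every two years from 1976.
def projected_elements(year, base_year=1976, base_count=300, factor=1.6):
    return base_count * factor ** ((year - base_year) / 2)

for year in (2015, 2020):
    print(year, f"{projected_elements(year):,.0f} elements")
# 2015 -> roughly 3 million; 2020 -> roughly 9 million with the rounded factor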

So, for a lot of basic structures we really have more than adequate computing power right now. Sensible meshing achieves a good level of accuracy. Over-meshing by following Moore’s Law does not improve accuracy; instead, it fills up computing resources and places a big burden on postprocessing. This trend is encouraged by decreasing default element sizes in most FEA meshers and the lack of good local mesh refinement controls in many CAD-embedded FEA meshers.

Better uses of available computing resources include running more design variations for reliability estimates or optimization studies. Postprocessing efficiency is a very big issue when manipulating and viewing results.

Looking Ahead

Basic structural models may have reached this milestone in the balance between mesh fidelity and processing power, but many simulation technologies, such as CFD (computational fluid dynamics), acoustics and crash analysis, have not yet topped out in terms of required mesh fidelity and processing power. For them, Moore’s Law continues to be a vital phenomenon. However, for the basic structural bracket we have reached our zenith, and there is a trend to over-mesh and over-model. Perhaps you, the reader, would like to construct a similar Moore’s Law of elements and forecast the growth of your favorite FEA models.

About the Author

Tony Abbey

Tony Abbey is a consultant analyst with his own company, FETraining. He also works as training manager for NAFEMS, responsible for developing and implementing training classes, including e-learning classes. Send e-mail about this article to [email protected].
