Design engineering workstations have become extremely powerful—where does it make sense to invest your IT budget?

The Dell Precision family of fixed workstations provides a range of options to meet various design engineering needs. Image courtesy of Dell.

Computing technology has advanced to the point that an engineering or design workstation can be equipped with an outrageous amount of power—enough to fly through design engineering modeling and simulation tasks—for a price. With new multi-core CPUs, ultra-fast GPUs and terabytes of memory, it’s possible to tackle real-time simulation, rendering, virtual reality applications and complex data science tasks using a single desktop or tower unit.

However, all of that horsepower can be expensive. Most companies have to stick to a budget; where does it make sense to max out your computer’s specifications, and which features really don’t need all that power? It depends on the applications your engineers are using, and what types of capabilities you believe they might need in the future.

“What customers want in these workstations is really driven by the features and new software versions of SolidWorks, CATIA, NX or any of those products with new feature sets,” says Tim Lawrence, vice president of operations and engineering at BOXX Technologies. “That’s driven by the competitive nature of the CAD packages. As things become more threaded, they bring in the ability to use multi-core CPUs, and some features have moved toward GPU acceleration.”

The BOXX APEXX Enigma S3 (pictured) is built with the latest Intel Core i7 or i9 processor overclocked to 5.1GHz. The BOXX APEXX T3 features a 32-core 2nd Gen AMD Ryzen Threadripper processor. Image courtesy of BOXX.

In some cases, companies have incorporated real-time, ray-traced rendering or other complex operations into their workflows. Simulation has become more mainstream and is being implemented at levels that weren’t possible previously. Engineering firms are also starting to deploy blended-use workstations that combine traditional CAD, CAE and simulation capabilities with deep learning, artificial intelligence and data science.

“That can mean a shift in balance toward GPU capabilities in the workstation for traditional simulation applications,” says Brett Newman, vice president of marketing and customer engagement for HPC and AI at Microway. “In that case customers typically go with a mid- to high-range Quadro GPU, but we are starting to see that crest up to higher-end Quadro cards.”

The availability of faster GPUs that can work seamlessly with design and simulation software packages that were created to take advantage of those chips has also altered expectations. “NVIDIA has been pushing ray tracing for years, but in some ways that solution was ahead of its time for a large number of users,” says Carl Flygare, NVIDIA Quadro product marketing manager at PNY Technologies. “With the introduction of the RTX line of GPUs, now you have real-time ray tracing so somebody using SolidWorks that wants to do photorealistic renders can do so fluidly. That’s a huge change.”

Generative design capabilities are also driving the adoption of more powerful workstations, as companies add artificial intelligence and other technologies into the design process. Flygare also says that solutions like ANSYS Discovery Live and PTC Creo Simulation Live provide real-time simulation capabilities to non-specialist users, which has also increased demand for more powerful hardware.

“GPU-accelerated computing reduces some of the tedious, repetitive tasks that engineers have to do,” says Andrew Rink, marketing strategy leader at NVIDIA. “Rendering de-noising, which is something NVIDIA is pushing hard, keeps designers in their creative flow. They can check out different angles, see that visualization. That type of tool is being adopted across the market.”

That really highlights the different approach users may take to selecting a workstation. “People are wearing a lot of different hats and take on a lot of different tasks,” says Josh Covington, managing director of marketing and sales at Velocity Micro. “We had a customer recently that needed GeForce and Quadro in the same desktop because they were using different applications and didn’t want to have two separate systems. That’s the type of customization that users are looking for.”

Software Drives Hardware

Understanding what hardware components drive the performance of the applications being used will help determine which workstation specifications to focus on when purchasing a new system. “You can spend the same amount of money and get radically different performance if you understand whether the application is more lightly threaded, or if you are doing modeling and assembly work,” Lawrence says. “The CPU frequency drives performance, so you can put your budget toward higher frequency CPUs.”

“We have to determine if you are going to take advantage of multi-threading or single-threading,” Covington adds. “Most 2D CAD design packages are single-threaded, and don’t need as much GPU power. If rendering in 3D, then you need a higher-end Quadro graphics card like the RTX 5000 or P5000. RAM is more important if you are working with large files or doing complex simulation. In most cases 32GB of RAM is fine; larger tasks may require 64GB or 128GB.”
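The sizing guidance above can be sketched as a simple lookup table. The workload categories and spec values below are illustrative paraphrases of the quoted rules of thumb, not any vendor's actual configurator:

```python
# Hedged sketch: mapping workload type to suggested workstation specs.
# Categories and values are illustrative, drawn loosely from the article's
# rules of thumb (2D CAD is single-threaded; 3D rendering wants a high-end
# GPU; 32GB RAM is usually fine, big simulation jobs may need more).

WORKLOAD_SPECS = {
    "2d_cad":       {"threads": "single", "gpu": "entry-level",              "ram_gb": 32},
    "3d_rendering": {"threads": "multi",  "gpu": "high-end (e.g. RTX 5000)", "ram_gb": 64},
    "simulation":   {"threads": "multi",  "gpu": "mid/high-end",             "ram_gb": 128},
}

def recommend(workload: str) -> dict:
    """Return the suggested spec profile for a named workload."""
    spec = WORKLOAD_SPECS.get(workload)
    if spec is None:
        raise ValueError(f"unknown workload: {workload}")
    return spec

print(recommend("2d_cad"))
```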

NVIDIA and its partners have rolled out new workstations designed specifically for data scientists, analysts and engineers who want to make use of deep learning. Image courtesy of NVIDIA.

Conversely, if users are working on mechanical simulation or using highly threaded solvers, they can benefit from a dual-processor configuration with a high core count. If a company isn’t planning to use the workstation for anything other than CAD, for example, they may not need a top-bin GPU.

“We spend a fair amount of time trying to characterize different pieces of software with our customers so we can have good information and evidence to help make that decision,” Lawrence says. “It can be difficult to look into the future and build the right capabilities into the workstation.”

Newman says that for customers that must balance budget and performance, one big trade-off is usually in core count. “How far can you climb up the core count chain for the application?” Newman says. “You shouldn’t climb up that far in most cases, because it doesn’t yield enough performance for your particular application. A lot of applications are parallelized, but not so much so that you need 20 cores. You can give up so much clock speed at 28 cores that you yield worse performance for some applications. That’s one where you need to understand how the applications work.”
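Newman's core-count trade-off is essentially Amdahl's law combined with the lower clock speeds typical of high-core-count parts. A minimal sketch, with illustrative (not benchmarked) parallel fractions and clock frequencies:

```python
# Hedged sketch: Amdahl's-law estimate of why more cores can hurt when
# clock speed drops at higher core counts. The parallel fraction and the
# frequencies below are illustrative assumptions, not benchmark data.

def runtime(parallel_fraction: float, cores: int, clock_ghz: float) -> float:
    """Relative runtime: serial part plus parallel part, scaled by clock."""
    serial = 1.0 - parallel_fraction
    return (serial + parallel_fraction / cores) / clock_ghz

# Example: a solver whose work is 80% parallelizable.
fast_few_cores = runtime(0.80, cores=8, clock_ghz=5.0)    # high-frequency 8-core
slow_many_cores = runtime(0.80, cores=28, clock_ghz=2.7)  # 28-core at a lower clock

print(f"8 cores @ 5.0 GHz : {fast_few_cores:.3f}")
print(f"28 cores @ 2.7 GHz: {slow_many_cores:.3f}")
```

Under these assumptions the high-frequency 8-core part finishes sooner: beyond the point where the serial fraction dominates, extra cores buy little, while the lost clock speed costs the whole run.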

“People often try to jump to too high-end a GPU or processor for what they can yield with it,” Newman adds. “The difference between an entry-level GPU and mid and high-level GPUs is substantive, so what you are trying to do really comes into play.”

In the mid-range of the GPU field, NVIDIA’s Rink says the RTX4000 has emerged as a popular option because it is VR ready, and has enough memory to run real-time simulation tools as well as AI applications.

Bear in mind that as GPU memory goes up, the DRAM also has to increase. “A good rule of thumb is you should have two times the DRAM of whatever the GPU memory is,” Flygare says. With more powerful processors, users also have to consider the costs of cooling, enclosures and power sources/power management equipment.
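Flygare's rule of thumb is simple arithmetic. A minimal sketch, with illustrative GPU memory sizes:

```python
# Hedged sketch of the "two times the GPU memory" DRAM rule of thumb
# quoted above. The GPU memory sizes are illustrative card configurations.

def suggested_dram_gb(gpu_memory_gb: int, multiplier: int = 2) -> int:
    """System RAM suggestion: roughly twice the GPU's on-board memory."""
    return gpu_memory_gb * multiplier

for gpu_gb in (8, 16, 24, 48):
    print(f"GPU with {gpu_gb} GB -> at least {suggested_dram_gb(gpu_gb)} GB of DRAM")
```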

There are areas where it is possible to spend too much with little benefit, or workstation elements that can be scaled back a bit given advances in other areas of the machine. For example, if you are offloading more of the workload to the GPU, it may be possible to scale back to a single CPU in the device. If software isn’t optimized to take advantage of a large number of cores, then there’s little benefit to throwing more cores at those tasks. People also often overspend on RAM.

Workstation hardware is determined by software requirements. Images courtesy of Velocity Micro.

Firms are also deploying workstations with different types of functionality across departments as roles evolve. “We’re seeing customers trying to make distinctions over what their day-in-day-out workday looks like versus the worst case scenario,” Flygare says. “If it’s a department, they may just configure one or two systems to handle the really heavy lifting, and everybody else is using a system that is configured to run CAD. At the departmental level, they are looking at the kind of diversity of configuration they should have.”

Planning Ahead

Workstations should also be configured with future needs in mind. While designers may not need real-time rendering or simulation now, they may need to add that to their workflows later. There are also changes in how and where engineers work that will affect the types of workstation technology deployed.

“There’s an increased need for mobility, and not just with laptops; the dynamics of working from anywhere can run up against performance and the requirements of higher-end CAD packages,” Lawrence says. “In supporting companies with distributed workforces that have proprietary IP concerns, they may need some capabilities for engineers to work remotely or from home. Being able to take this high-end software and provide access to it from nearly anywhere is a challenge that companies are facing.”

That said, the emergence of affordable, high-performance GPU computing, optimized software and more powerful processors is making these new capabilities much more attainable, even for budget-conscious designers. “Real-time engineering simulation is a game-changing technology,” Rink says. “Every engineer can use that and it’s available today.”
