LIVE from CAASE 18: AI is Easier Than You Think

Why AI may be easier to implement than you think, and how it can benefit industry 4.0 and digital twin projects, from data scientist Jerry Overton's keynote at CAASE 18.

Data scientist Jerry Overton discusses the role of AI in engineering, manufacturing, and digital twin projects (image courtesy of Jerry Overton).

On the second day of CAASE 18 (Conference on Advancing Analysis & Simulation in Engineering, June 5-7, 2018, in Cleveland), cohosted by NAFEMS and DE, Jerry Overton from DXC Technology took the stage to convince the show's 500+ attendees of something many might find unbelievable: AI (artificial intelligence) is cool, useful, practical, and — perhaps most important — do-able.

“Aside from being a consultant, I also dabbled in AI, so I thought, would it be possible to create an AI version of myself, an algorithm that would answer questions the same way I would?” recalled Overton, the author of Going Pro in Data Science (O'Reilly).

The result is, as he demonstrated on stage, an AI or AI-like chatbot that responds to text-input questions by searching, scraping, and borrowing words and phrases from Overton's own published blog posts, tweets, and writings.

In that sense, AI in its most basic implementation could be an intelligent search engine that retrieves relevant data and presents it to users on demand. This principle could be applied to product data management, supply chain management, factory 4.0 and beyond, by simply augmenting existing legacy setups with smart algorithms and ubiquitous connectivity.
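As a rough sketch of that retrieval principle (not Overton's actual chatbot, and using a three-passage corpus invented for illustration), a few lines of Python can match a question against stored text with bag-of-words cosine similarity:

```python
import math
from collections import Counter

# Hypothetical mini-corpus standing in for a body of blog posts and tweets.
DOCS = [
    "Machine learning finds patterns hidden in historical data.",
    "Digital twins mirror physical equipment using sensor streams.",
    "Supply chain management benefits from smart algorithms.",
]

def tokens(text):
    # Lowercase words with trailing punctuation stripped.
    return [w.strip(".,?!").lower() for w in text.split()]

def cosine(a, b):
    # Cosine similarity between two bag-of-words count vectors.
    num = sum(a[w] * b[w] for w in set(a) & set(b))
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def answer(question):
    # Return the stored passage most similar to the question.
    q = Counter(tokens(question))
    return max(DOCS, key=lambda d: cosine(q, Counter(tokens(d))))

print(answer("How do digital twins use sensors?"))
# → Digital twins mirror physical equipment using sensor streams.
```

Real systems add stemming, TF-IDF weighting, and much larger corpora, but the retrieve-and-rank core is the same.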

But true AI goes beyond that. “When I say artificial intelligence, I mean something that, if you were to see a human do the same thing, you'd acknowledge that person as smart,” Overton explained. “Usually intelligence requires some type of learning. This is why machine learning is at the heart of AI.”

AI Under Supervision

First, you need a set of data. AI learns from observing historical records and archival data (say, equipment failure incidents from the last five years) to understand what represents normal and healthy states, then learns to infer what represents anomalies that you need to address.
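In its simplest form, that means learning a "normal" range from history and flagging departures from it. The sketch below uses invented sensor readings (not real failure records) and a basic three-standard-deviation threshold:

```python
import statistics

# Hypothetical readings (e.g., bearing temperature in degrees C) logged
# during known-healthy operation.
HISTORY = [70.1, 69.8, 70.4, 70.0, 69.9, 70.2, 70.3, 69.7, 70.1, 70.0]

# "Learn" the normal state: just the mean and spread of healthy data.
mean = statistics.mean(HISTORY)
stdev = statistics.stdev(HISTORY)

def is_anomaly(reading, k=3.0):
    # Flag readings more than k standard deviations from the learned norm.
    return abs(reading - mean) > k * stdev

print(is_anomaly(70.2), is_anomaly(85.0))  # → False True
```

Production systems use far richer models, but the pattern — fit on history, then score new observations against it — carries over.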

“Algorithms can find patterns hidden in data, and even learn the patterns from the data. The learning can be supervised, or unsupervised,” said Overton. “Supervised machine learning is, when I direct the learning to achieve a specific end.”

For example, you might feed the AI system millions of photos of cats or gaskets so the program, over time, learns to recognize cats and gaskets from their shapes.
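In miniature, supervised learning boils down to fitting labeled examples. The toy nearest-centroid classifier below stands in for that idea, with made-up two-number "features" (think roundness and aspect ratio) in place of actual photos:

```python
# Labeled training examples: (features, label). Values are invented.
TRAIN = [
    ((0.9, 0.2), "gasket"), ((0.8, 0.3), "gasket"),
    ((0.2, 0.9), "cat"),    ((0.3, 0.8), "cat"),
]

def centroids(examples):
    # Average the feature vectors of each class.
    sums = {}
    for (x, y), label in examples:
        sx, sy, n = sums.get(label, (0.0, 0.0, 0))
        sums[label] = (sx + x, sy + y, n + 1)
    return {lbl: (sx / n, sy / n) for lbl, (sx, sy, n) in sums.items()}

def predict(point, cents):
    # Label of the closest class centroid (squared Euclidean distance).
    return min(cents, key=lambda l: (point[0] - cents[l][0]) ** 2
                                    + (point[1] - cents[l][1]) ** 2)

cents = centroids(TRAIN)
print(predict((0.85, 0.25), cents))  # a round, flat shape → gasket
```

Real image recognition swaps the hand-picked features for ones a deep network learns itself, but the supervised recipe — labeled data in, predictions out — is the same.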

“Algorithms are really good at some things people are not so good at, like finding patterns in tons of financial transactions,” said Overton.

In unsupervised learning, by contrast, the algorithm or program figures out the correlations and patterns in the data on its own.
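A minimal unsupervised example, with invented transaction amounts: a one-dimensional k-means loop splits the values into two groups without ever being told which is which (it assumes the data actually contains two separable clusters):

```python
# Unlabeled transaction amounts; a mix of everyday and large purchases.
AMOUNTS = [4.5, 5.0, 5.2, 4.8, 950.0, 980.0, 1020.0]

def kmeans_1d(values, iters=10):
    # Start the two cluster centers at the extremes, then alternate
    # between assigning points and re-averaging the centers.
    lo, hi = min(values), max(values)
    for _ in range(iters):
        a = [v for v in values if abs(v - lo) <= abs(v - hi)]
        b = [v for v in values if abs(v - lo) > abs(v - hi)]
        lo, hi = sum(a) / len(a), sum(b) / len(b)
    return a, b

small, large = kmeans_1d(AMOUNTS)
print(small, large)
```

The algorithm was never told "small" from "large"; the grouping emerges from the data, which is the essence of the unsupervised pattern-finding Overton described.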

There's also cognitive computing, or computation that's designed to mimic human interactions, such as natural language processing. When machine learning occurs in interconnected computing clusters, called neural networks, it's usually called deep learning, explained Overton.
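Those networks are built from simple learning units. As a whiteboard-scale illustration (nowhere near a real deep network), a single artificial neuron can learn the logical AND function with the classic perceptron update rule:

```python
# Inputs and target outputs for logical AND.
DATA = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w, b, lr = [0.0, 0.0], 0.0, 0.1  # weights, bias, learning rate
for _ in range(20):  # a few passes over the data are enough here
    for (x1, x2), target in DATA:
        out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        err = target - out
        # Nudge weights toward reducing the error (perceptron rule).
        w[0] += lr * err * x1
        w[1] += lr * err * x2
        b += lr * err

preds = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for (x1, x2), _ in DATA]
print(preds)  # → [0, 0, 0, 1]
```

Deep learning stacks thousands of such units into layers and trains them jointly, but each one is doing roughly this: adjust weights to shrink the error.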

A combination of these AI methods could lead to new possibilities, such as CAD programs that can suggest an appropriate design based on load cases and stress patterns; or factory equipment that can anticipate its own pending catastrophic failures.

Sensor-Driven, AI-Powered Digital Twins

Digital twins, or virtual counterparts to physical products operating in the field, rely on streams of real-time data from sensors. The data represents a gold mine of intelligence an AI program can analyze and dissect.

With sensor-equipped factories, “You'd essentially have a digital representation of your physical manufacturing process,” Overton pointed out. “Now you can try out all kinds of what-if scenarios, playing with new materials and processes that you might never try in a real setting.”

Whereas such experiments in the real world come with the risk of costly failures, a simulation of the same on digital twins would cost only compute power, which is relatively inexpensive with on-demand providers such as AWS.

Can AI Understand What Humans Can't?

During the Q&A, Keith Meintjes, executive consultant for simulation, CIMdata, asked, “What if I have a set of data, but I don't have any underlying theories about the governing physics and such?”

“There are techniques for dealing with that,” said Overton. “It's unstructured exploration. There is software that can look at your data and provide you with possible theories.”

Meintjes wondered if AI might make the same kind of mistakes humans tend to make with patterns — inferring causal relationships from incidental correlations, such as a cluster of diseased patients in a certain geographic region where there may be no causal link between the place and the disease.

The incorporation of AI or AI-like algorithms into design software has resulted in a new phenomenon, known as generative design (GD). Leading software makers such as Autodesk, Dassault Systemes, Altair, and others have developed variations of GD.

GD proponents argue GD cannot replace talented human engineers; it's merely intended as a tool to perform repetitive design and analysis tasks, leaving the human designer as the final arbiter of the fate of the software-proposed design.

But that may not allay the fear of those who have watched the dystopian Terminator film series one too many times.

For more on AI and Overton's thoughts, read “The Dawn of AI Bounty Hunters,” January 2018.


About the Author

Kenneth Wong

Kenneth Wong is Digital Engineering’s resident blogger and senior editor. Email him at [email protected] or share your thoughts on this article at
