Human-Machine Interaction in the Age of Industrial Automation

Machine learning drives a new wave of industrial automation.

At Van Hool, a family-owned coach builder based in Belgium, engineers use Oqton software to program the robots for automated welding. Image courtesy of Oqton.


Need a hand with an industrial project? Whether to stuff boxes of snacks or to fill a row of fast-moving vaccine vials, more manufacturers are turning to robots for the labor-intensive, repetitive tasks.

The global collaborative robot (cobot) market is expected to grow from $1.1 billion (2022) to $9.2 billion by 2028, at a compound annual growth rate of 41.5% from 2022 to 2028, according to a report from Research and Markets.
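
For reference, the compound annual growth rate implied by those endpoints can be checked directly. Using the rounded figures above (the report's own, unrounded inputs may differ slightly):

```latex
\text{CAGR} = \left(\frac{V_{2028}}{V_{2022}}\right)^{1/6} - 1
            = \left(\frac{9.2}{1.1}\right)^{1/6} - 1 \approx 0.42
```

The small gap to the quoted 41.5% most likely reflects rounding of the endpoint values.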

The term “robot” generally conjures up menacing humanoid entities made of metal and silicon, as depicted in popular sci-fi films such as “Lost in Space” and “The Terminator.” The movie versions are not part of the current manufacturing landscape, but what is becoming common is the use of articulated, jointed robotic arms equipped with computer vision.

These jointed arms can navigate a predefined zone and perform the required tasks with little or no human supervision. Running on machine learning (ML) code, they sometimes devise strategies for accomplishing assigned tasks that even their human overseers hadn’t considered. The phenomenon adds fresh twists, and new questions, to the ongoing human-machine collaboration.

Defining the Human-Machine Paradigm

“From our experience with most of our customers, cobots work alongside human workers rather than in direct collaboration with them,” says Alex Greenberg, director of robotics 4.0 simulation, Siemens Digital Industries Software. “Cobots are easy to install and program, and you can easily deploy a cobot without the typical investment in safety required with a standard industrial robot. Usually, a light fence and a safety sensor are sufficient.”

“There are some fallacies around cobots,” says Tom Hummel, Rapid Robotics’ VP of technology. In theory, cobots are equipped with sensors so they can stop or avoid collisions when they detect obstacles, including humans.

Often, industrial robots work alongside humans but seldom in direct collaboration with them for safety reasons. Image courtesy of Siemens.

However, “If a cobot moves as fast as an industrial robot, it cannot necessarily work alongside humans because it has so much energy. If a car with really good brakes hits you, the car still hits you,” Hummel adds.

Michael Kahane, chief technical officer of robotic solution provider OSARO, also notes, “It’s one thing to sense the obstacle. Quite another to stop when you do. The faster the robot is moving, the longer it takes to stop.”
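
Kahane’s point follows from basic kinematics: under constant deceleration a, stopping distance grows with the square of speed. With an assumed braking deceleration of 2 m/s² (a purely illustrative value), the numbers work out roughly as:

```latex
d = \frac{v^2}{2a}, \qquad
d(0.25\,\text{m/s}) = \frac{0.25^2}{2 \cdot 2} \approx 16\,\text{mm}, \qquad
d(1.5\,\text{m/s}) = \frac{1.5^2}{2 \cdot 2} \approx 563\,\text{mm}
```

A sixfold increase in speed means a 36-fold longer stopping distance, which is why fast-moving arms cannot rely on obstacle sensing alone.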

Rapid Robotics started in 2019 with six employees in a small San Francisco apartment, and deployed its first robot within six weeks of founding, Hummel says. At its headquarters in the city’s Mission Bay district, most robots operate inside safety cages, with human operators at a safe distance. The robots interact autonomously with packages and products, and hardly ever come in contact with humans.

“Cobots working with humans in the loop are rare cases. If a human is in its workspace, the cobot has to operate slowly, and that actually negates the value of the deployment,” Hummel says.

ISO 10218 caps a robot’s safety-rated monitored speed at 250 mm/s. Some manufacturers design their robots with two modes: one that meets the prescribed speed limit when the robot is interacting with humans, and another that allows the robot to move faster when operating by itself.
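
A minimal Python sketch of that dual-mode logic; the function and full-speed value here are illustrative, not any vendor’s actual interface:

```python
# Illustrative sketch of the dual-speed-mode logic described above.
# The 250 mm/s figure is ISO 10218's safety-rated monitored speed;
# the full-speed value and sensor wiring are hypothetical.

SAFETY_RATED_SPEED_MM_S = 250.0   # ISO 10218 limit when humans are near
FULL_SPEED_MM_S = 1500.0          # example unrestricted production speed

def select_speed(human_in_workspace: bool) -> float:
    """Return the allowed tool speed for the current safety state."""
    if human_in_workspace:
        # Collaborative mode: clamp to the safety-rated monitored speed.
        return SAFETY_RATED_SPEED_MM_S
    # No human detected: the robot may run at full production speed.
    return FULL_SPEED_MM_S

# Example: a safety sensor (e.g., a light curtain) reports presence.
print(select_speed(human_in_workspace=True))   # 250.0
print(select_speed(human_in_workspace=False))  # 1500.0
```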

“There’s a desire among our customers to see more human-machine interactions, but right now, for safety reasons, they have to be separated,” says Gerard Andrews, senior product marketing manager for robotics, NVIDIA. “While the separation makes sense for safety, it also limits the automation currently possible. If you have to keep a distance from the robot, you can automate steps A to B, but because the human is not there, you cannot take it further. When the safety cages are eliminated and humans and machines can work side by side, then you can rethink your automation paradigm.”

Human-in-the-Loop Automation

Can you do a welding job without feeling the heat? You can, with the software-driven remote-welding system from Oqton, part of 3D Systems. The software allows the operator to plan and program the weld task from start to finish using CAD models. In virtual planning, the operator can identify potential collisions and address them before the actual job begins.
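
As a rough illustration of what such offline collision checks involve, here is a minimal Python sketch that tests planned waypoints against obstacle bounding boxes. The geometry and names are hypothetical, not Oqton’s implementation:

```python
# Minimal sketch of offline collision checking along a planned weld path.
# Geometry is reduced to axis-aligned bounding boxes (AABBs); real
# systems check full 3D meshes of the torch, part and fixtures.

def point_in_aabb(p, box_min, box_max):
    return all(lo <= c <= hi for c, lo, hi in zip(p, box_min, box_max))

def check_path(waypoints, obstacles):
    """Return the indices of waypoints that collide with any obstacle."""
    hits = []
    for i, wp in enumerate(waypoints):
        if any(point_in_aabb(wp, *box) for box in obstacles):
            hits.append(i)
    return hits

# Hypothetical weld path (mm) and a clamp modeled as an AABB.
path = [(0, 0, 100), (50, 0, 100), (100, 0, 100)]
clamp = ((40, -10, 90), (60, 10, 110))
print(check_path(path, [clamp]))  # [1] -> rework this waypoint offline
```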

“By 2023, our nation’s workforce will need over 375,000 welders to satisfy the demands of several industries,” according to a statement from the American Welding Society. “The labor shortage is compounded by the estimated 150,000 welders approaching retirement age.”

In NVIDIA’s Isaac Platform, you can simulate warehouse activities with realistic human avatars and conveyors. Image courtesy of NVIDIA.

“With welding, there is heat (skin burns), ultraviolet light (skin and eye burns) and hazardous fumes involved,” says Mark Forth, general manager of Oqton. The nature of the job and the labor shortage make welding ideal for robotic automation. One of Oqton’s customers is Van Hool, a Belgian family-owned coachbuilder.

“In the past, when we were programming a part on the robot, we used to focus on the programming of the robot itself, but now we can focus on the actual welding, thanks to the Oqton software,” says Pieter Ceulemans, robot welding engineer for Van Hool.

“Oqton software is built for welders, not CAD and simulation experts, so users don’t need in-depth knowledge of robots, 3D geometry or programming,” says Forth.

In 2016, the Siemens Motion Control plant in Erlangen, Germany, deployed cobots to its existing lines as part of a strategic decision to increase flexible automation. One example is the automation of a post-surface mount technology line with the help of 25 cobots from Denmark’s Universal Robots.

“From the start, we included a sponsored degree-level apprenticeship in mechatronics, a parallel training program for the people who would no longer be required to do the tedious tasks the robots would be taking over,” says Greenberg.

He adds that many of the people could handle advanced tasks, so they became part of the robotics lab, debugging issues and fixing robot code. “The most interesting thing about that project was that the people were pleasantly surprised to find they had talent they weren’t even aware of,” Greenberg says.

For the project, Siemens used Process Simulate, part of the company’s digital manufacturing software portfolio. The company created a digital twin of the plant to identify and resolve the collision and interlocking problems beforehand.

Process Simulate users can plan the manufacturing process in detail with 3D models of human workers, robots and equipment, which include kinematic and logic behaviors, according to Greenberg.

“Utilizing various capabilities, such as intelligent robot reach and placement analysis, human operations simulation and analysis, collision detection and avoidance, automatic path planning, virtual commissioning and seamless loading of the simulation in VR, enables customers to select the best solution alternative and author robust robotic programs. That greatly reduces the risk of equipment damage or of failing to meet the required production performance,” he says.

In Process Simulate, programming is language-agnostic; the user doesn’t need to be an expert in vendor-specific robot languages such as those used by Fanuc or ABB. “Eventually, the software automatically produces the specific robot code that will be loaded into the robot’s control software, but the programming interface in Process Simulate is generic, which makes it easier. It allows you to program even complex robot tasks for multiple product variants, generate synthetic data to train machine learning algorithms, or virtually commission an artificial intelligence-based machine vision system, with easy-to-use function blocks,” adds Greenberg.
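
To make the idea concrete, here is a toy Python sketch of vendor-agnostic programming: a task authored once as generic operations, then translated into a controller-specific dialect. The emitter below is invented for illustration and is not how Process Simulate actually generates code:

```python
# Conceptual sketch of vendor-agnostic robot programming: tasks are
# authored once as generic operations, then translated into a specific
# controller dialect. The emitter below targets a made-up Fanuc-like
# dialect; it is not any vendor's real language.

GENERIC_PROGRAM = [
    ("move", {"x": 100, "y": 0, "z": 50}),
    ("weld_on", {}),
    ("move", {"x": 200, "y": 0, "z": 50}),
    ("weld_off", {}),
]

def emit_vendor_a(program):
    """Translate generic operations into one controller's dialect."""
    lines = []
    for op, args in program:
        if op == "move":
            lines.append(f"L P[{args['x']},{args['y']},{args['z']}] 100mm/sec FINE")
        elif op == "weld_on":
            lines.append("ARC START")
        elif op == "weld_off":
            lines.append("ARC END")
    return "\n".join(lines)

print(emit_vendor_a(GENERIC_PROGRAM))
```

Supporting a second controller brand then means writing another emitter, not reprogramming the task.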

Accelerated computing pioneer and graphics processing unit (GPU) maker NVIDIA offers Omniverse, its immersive 3D simulation platform, as a place to build and house digital twins. Omniverse uses Universal Scene Description (USD), a 3D scene format developed by Pixar, as its backbone to ingest CAD files. Often, with digital twins for process simulation, you might not need a detailed model; a representative one might suffice. Incorporating human avatars, however, requires a more thoughtful approach.

“You don’t need to have a precise representation of your workers, but you should have a good representation of your workforce in its distribution of heights, sizes, body types and physiques,” says Andrews.
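
A minimal Python sketch of that idea: instead of one fixed avatar, the simulation is populated from an assumed spread of body dimensions. The distribution parameters below are placeholders, not real anthropometric data:

```python
import numpy as np

# Illustrative sketch: sample avatar heights from a distribution so the
# digital twin reflects a workforce, not a single "average" worker.
# Mean and spread here are placeholder values, not measured data.

rng = np.random.default_rng(seed=42)

def sample_workers(n, height_mean_cm=170.0, height_sd_cm=9.0):
    """Draw n avatar heights (cm) covering the workforce distribution."""
    heights = rng.normal(height_mean_cm, height_sd_cm, size=n)
    return np.clip(heights, 145.0, 205.0)  # keep within plausible bounds

avatars = sample_workers(25)
print(f"min {avatars.min():.0f} cm, max {avatars.max():.0f} cm")
```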

The self-navigation challenges with cobots are quite similar to those confronted by autonomous vehicles. But the required tasks add another layer of complexity.

“For example, it cannot hit a human, but it needs to move pieces of material and put them in a particular order, and then verify if they’re packed tight enough. To get the cobot to understand its world and the tasks, you really need a framework in place,” he says.

In a January 2023 blog post on updates to NVIDIA’s robotic simulation platform Isaac, Andrews wrote, “Significant new capabilities for robotics researchers include advances in Isaac Gym for reinforcement learning and Isaac Cortex for collaborative robot programming. Additionally, a new tool, Isaac ORBIT, provides simulation operating environments and benchmarks for robot learning and motion planning.”

“Isaac Sim’s new people simulation capability allows human characters to be added to a warehouse or manufacturing facility and tasked with executing familiar behaviors—like stacking packages or pushing carts,” Andrews says. “The goal is to make programming cobot behavior as easy as programming a game, because it won’t be scalable if you need a team of Ph.D.’s to program a cobot to do one simple task.”
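
In that spirit, here is a conceptual Python sketch of tasking simulated characters with routine behaviors. The class and behavior names are invented for illustration; this is not Isaac Sim’s API:

```python
import random

# Conceptual sketch of assigning familiar behaviors to simulated human
# characters in a virtual facility. All names here are hypothetical.

BEHAVIORS = ["stack_packages", "push_cart", "walk_aisle"]

class SimHuman:
    def __init__(self, name: str):
        self.name = name
        self.task = None

    def assign(self, behavior: str) -> None:
        """Give this character a routine behavior to execute."""
        self.task = behavior
        print(f"{self.name} -> {behavior}")

# Populate a simulated warehouse crew and hand out tasks.
crew = [SimHuman(f"worker_{i}") for i in range(3)]
for person in crew:
    person.assign(random.choice(BEHAVIORS))
```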

Ask the Machine for a Suggestion

OSARO offers piece-picking robots, generally used in warehouse automation. The company touts its advanced, ML-driven vision and control software as the foundation for all of its products.

“For us, machine learning is useful for training the robots to find the best pick point, for example,” says Kahane. “Classical computer vision has some weaknesses. Suppose you train a robot to pick a box. If the box is damaged, the robot running on classical computer vision might get confused. In those areas, machine learning is much more robust.”
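
A toy Python sketch of the learned approach: candidate pick points are described by simple features and scored by a model trained on past outcomes. The features, data and model here are synthetic stand-ins for a production vision pipeline, not OSARO’s method:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy sketch of learned pick-point scoring. Each candidate point is
# described by hand-made features; a model trained on past pick
# outcomes ranks new candidates. Everything below is synthetic.

rng = np.random.default_rng(0)

# Features per candidate: [surface flatness 0-1, distance to edge (cm)]
X = rng.uniform([0.0, 0.0], [1.0, 10.0], size=(500, 2))
# Synthetic ground truth: picks succeed on flat spots away from edges.
y = ((X[:, 0] > 0.6) & (X[:, 1] > 2.0)).astype(int)

model = LogisticRegression().fit(X, y)

candidates = np.array([[0.9, 5.0],   # flat, central
                       [0.3, 1.0]])  # crumpled, near a damaged edge
scores = model.predict_proba(candidates)[:, 1]
best = candidates[scores.argmax()]
print(f"pick-success scores: {scores.round(2)}, chosen: {best}")
```

Production systems learn richer representations directly from camera data, but the score-and-rank pattern is the point: a damaged box lowers a candidate’s score rather than breaking a hand-coded rule.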

In one early experiment, OSARO used ML to train a robotic arm to pick up bottles using its suction cup. “The suction cup it had was a little too big to effectively pick the bottle from the top as intended,” Kahane recalls. “So the program figured out another method. It learned to tilt the bottle at an angle to get a better grip. I thought that was pretty cool.”

OSARO now uses robotic arms with three different suction cups, allowing the program to select and switch to the most suitable cup based on the target’s size, features and orientation.
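
A minimal Python sketch of that selection logic, keyed on target size alone; cup names and thresholds are hypothetical:

```python
# Illustrative rule for choosing among three suction cups by target
# size. In practice the choice also weighs surface features and
# orientation, as noted above; values here are made up.

CUPS = [
    ("small",  0.0,  3.0),   # name, min and max target width (cm)
    ("medium", 3.0,  8.0),
    ("large",  8.0, 30.0),
]

def choose_cup(target_width_cm: float) -> str:
    for name, lo, hi in CUPS:
        if lo <= target_width_cm < hi:
            return name
    raise ValueError("target outside graspable range")

print(choose_cup(5.5))  # 'medium'
```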

“This is the trend right now, even if the technology is not yet mature,” says Greenberg. “Eventually, the trained robots would be able to perform the required tasks, even if the targets are not exactly positioned. Just looking at the environment through the camera, they’ll figure out a way to execute the task as long as there’s no obstacle and it’s within their scope of capabilities.”

AI or ML plays a crucial role in Rapid Robotics’ deployments. At present, the role of AI is confined to perception-based work cell analysis and optimization. “Where do we place the part? Is there anything that would cause a safety issue? Is anything blocking the motion path? These are all important questions to resolve and we’ve developed some key technologies for them,” says Hummel.

But Hummel also foresees a much bigger role for AI in the pre-deployment phase. “Things like figuring out where the robot should go to optimize cycle time, where the machines should go, or all the bracketry, the structural material … Can these be automated? What if our deployment engineers can simply specify what they want in plain language and receive a deployment template back? We see that as the future of AI,” he says.

Explaining how AI helps Oqton’s welding-automation software, Forth says, “We don’t pretend to know as much about welding as a professional with years of experience. What the AI does is learn how a particular expert works and over time suggests the best welding practices based on that expert’s preferences.”

The Human’s Changing Role

Automation and the introduction of semi-autonomous robotic limbs don’t eliminate the need for humans, but do change their functions. “The role of the humans is a supervisory role. They are relieved from the menial, injury-prone work, but they monitor the process,” says Kahane.

“I’ve found that robots usually upgrade humans’ jobs,” says Hummel. “We’ve seen operators who were glad to see a robot doing their job because they knew there were other jobs in the factory that were far more interesting and far more valuable for them to be doing.”

When doing an automation project, “the key is to include the people who will be affected from the start. Make them part of the holistic approach,” says Greenberg.

About the Author

Kenneth Wong

Kenneth Wong is Digital Engineering’s resident blogger and senior editor. Email him at [email protected] or share your thoughts on this article at digitaleng.news/facebook.
