Evolution of Sensors & Embedded Systems

Better sensors and embedded systems development tools enable better design. That trend has brought us far, and it will continue. Here's why, and how.

By Barbara G. Goode

 
Fig. 1: The 657HT Isotron accelerometer—which senses acceleration in three directions and withstands temperatures to 347°F—is an example of the ability to pack increasing functionality and performance into the same small package.

Dramatic. That word describes not only the evolution of sensor and embedded systems technologies over the past 15 years, but also the impact that this evolution is having on design engineering. Furthermore, it characterizes the influence that current and imminent developments pose for the future of design.

  The growth of sensors and embedded systems technologies “is nothing short of amazing,” says Tom Ales, a Kimberly-Clark Corp. research scientist. Fellow research scientist Shawn Sullivan concurs: “And this is just the tip of the iceberg.”

  Many Reasons for More Sensors
Smaller size and more modest power requirements, combined with improved reliability, greater capability and ease of use and better integration, add up to more sensor options for designers.

“Sensors are much smaller …  and also more integrated,” says Alex Gomez of Boston Engineering, whose company provides design engineering services for customers in a range of application areas. “In the past, you needed three accelerometers to measure in three dimensions. Now you can get all three in one package.”
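Gomez's point about integration can be made concrete: with all three axes in one package, a controller reads three components in a single transaction and combines them in software. A minimal sketch (pure math, no particular hardware assumed):

```python
import math

def acceleration_magnitude(ax, ay, az):
    """Combine the three axis readings (in g) from a triaxial
    accelerometer into a single overall magnitude."""
    return math.sqrt(ax * ax + ay * ay + az * az)

# A part at rest with its z axis pointing up reads roughly (0, 0, 1) g.
print(acceleration_magnitude(0.0, 0.0, 1.0))  # → 1.0
```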

  An example of this is the Endevco Model 657HT Isotron triaxial accelerometer with low-impedance output. (See Figure 1.) But according to Scott Mayo—who, as applications engineer for Meggitt Sensing Systems, supports the Endevco brand—this product represents another important advance: the ability to withstand extreme environmental conditions.

“Traditionally, this type of sensor has been limited to a maximum temperature of 248°F because of the electrical components,” Mayo says. “Now, we are able to continuously operate at 347°F.” The effort to push the boundaries on temperature is tied to the fact that jet and gas turbine engines can run efficiently at higher and higher temperatures, he explains. Mayo says that a lot of research continues to go into this effort, and into making sensors easier to mount, install and interface with.

  Sullivan calls attention to the importance of the IEEE 1451 standard, which had a goal of developing network- and vendor-independent sensor interfaces. The effect of this, according to Jamie Smith, director of industrial and embedded product marketing for National Instruments, is that because components can work together more cleanly, people doing monitoring and control “can take off-the-shelf components and integrate them into a system with easy-to-use software.”
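The key idea behind IEEE 1451 is that a transducer carries its own electronic data sheet (TEDS), so software can configure a channel without vendor-specific code. The sketch below uses a radically simplified, dictionary-based record; the field names are illustrative, not the actual IEEE 1451 binary layout:

```python
# A simplified, TEDS-like record: the sensor reports its own identity
# and scaling, so acquisition software can scale readings generically.
# (Field names are illustrative, not the real IEEE 1451 layout.)
teds = {
    "manufacturer_id": 42,
    "model": "657HT",
    "units": "g",
    "sensitivity_mv_per_unit": 10.0,   # assumed: 10 mV per g
}

def volts_to_engineering_units(volts, teds_record):
    """Scale a raw voltage reading using the sensor's own data sheet."""
    millivolts = volts * 1000.0
    return millivolts / teds_record["sensitivity_mv_per_unit"]

print(volts_to_engineering_units(0.25, teds))  # 0.25 V at 10 mV/g → 25.0
```

Because the scaling lives with the sensor, swapping in a different vendor's part means reading a different record, not rewriting the acquisition code.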

  This, Smith says, “is all driven by advances by sensor and semiconductor companies for consumer and general-purpose applications.” As an example, he points to video game remotes, chock-a-block full of sensors and able to communicate with the game console in multiple ways. Designers of such systems, he notes, have taken advantage of small size, low cost and low power requirements.

 
Fig. 2: This “portable toaster-printer” is envisioned to download news from your computer and toast it into your breakfast. This finalist in the Electrolux Design Lab 2008 competition hints at the outcome enabled by inexpensive sensor options. By adding in-demand features without adding cost, manufacturers can keep ahead of the competition.

Indeed, says Dan Spohn, regional sales manager for Kaman Precision Products, some sensors are actually decreasing in terms of features to reduce prices “enough to get selected for the next iToaster.” (See Figure 2.)

  Smith says the iPhone is a good example of a device that incorporates multiple sensors, including haptic controls, “that would never have been possible 15 years ago.”

  Gomez characterizes the effect of these developments as a “luxury”—meaning he can afford to add more sensors to a design than he otherwise would. In addition, he adds, sensors have improved in terms of reliability. Many incorporate controllers, communications and various peripherals on board (think system on a chip), and can regulate their own power. In fact, some can even supply their own power. These characteristics allow civil structures to reap the benefits provided by low-power sensors placed under the pavement of roads, or embedded into the concrete in bridges, to detect properties such as vibration and strain.

“Scientists and engineers are measuring more and more parameters—because they can,” Smith says.

  Mayo agrees. “Engineers are starting to realize that the more information they have, the better the view they have of operations,” he says, noting that massive system failures—he uses the example of the space shuttle—can prompt a move to add more sensors to gather additional information. Embedding sensors into the wing of the craft enabled better-informed decisions, he says. In this case, the driving issue was neither the sensor's cost nor its novelty; the technology wasn't new. Nobody had simply considered the sensor for that specific use before, and once they did, it proved a worthy addition.

  Easy Does It
Smith says engineers are adding sensors not only because they can, but “because doing so is easier, with automated measurements triggered by smart software.”

  Spohn agrees. “The cost of implementing an embedded system design has decreased, which has opened up the potential to apply embedded systems into more—and less costly—products,” he adds. “The competition is fierce in the lower-priced products, so a small edge like an additional feature can be the key to a large market share.”

  Part of the reason the implementation cost has decreased is that embedded system design and programming is so much easier to accomplish.

“The tools for developing embedded systems are vastly improving,” says Sullivan, who notes that even low-end CAD systems are offering embedded systems capabilities. He adds that verification systems are more powerful and accurate, and far more efficient.

“Fifteen years ago, a specialized software engineer was required to do the programming, and everything was done in assembly language,” he says. “Now, even a junior engineer right out of college can do that.” 

 
Fig. 3: Smart grid switches and reclosers aim to make the electrical grid as robust and self-healing as the Internet. To achieve this, they must automatically detect problems, isolate faults and redirect power. An NI CompactRIO embedded system (left), the “brain” of the switch, provides a field-reconfigurable platform for smart grid R&D by performing multiple measurement, signal processing and control processes in parallel and in real time. On the right is the output of a CompactRIO-based transformer monitoring system.

Whereas prototyping and analog systems adjustment used to require a lot of effort, now it is easy and “not frightening for someone, even if they are not well-versed in embedded systems,” Sullivan explains, adding that this fact “enables the use of more sensors.”

  Ales states that earlier in the decade, within just a couple of years, a number of low-cost kits ($200 or less) became available that opened embedded systems creation to the world. “People realized how easy it was to construct their own code,” he says, adding that “reference designs exploded—whereas this had been very niche.”

  Everything—signal conditioning, calibration, error correction—became “so much easier, and it was a very short hurdle to learn the ins and outs,” Sullivan says. Lego’s Mindstorms has even brought embedded systems development to grade schools, he points out.

 

The Digital Effect, and More
For applications such as robotics, in which weight is a critical factor, minimal wiring is just as important as miniature sensor size, says Boston Engineering’s Alex Gomez. He mentions “1-Wire”—a name trademarked by Maxim Integrated Products—as technology that allows him to avoid using a large wiring harness. The advance is dependent on digital operation.

Digital offers other advantages, too, says Gomez, including an improvement in accuracy. Whereas sensors once generated analog serial output, today’s sensor output is digitized on the chip and goes directly to the controller, without the need for a converter. This means not only a cost savings as a result of fewer components (and simpler design), but also less noise, and less opportunity for data to become corrupted.
Another nice aspect of digital domain, says Gomez, is that you can plan for expansion. “It’s never as easy as ‘plug and play,’” he adds, but it’s no longer such a big deal to add on, to supply more functionality with additional sensors.
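Gomez's "plan for expansion" point can be sketched in software terms: with addressable digital sensors on a shared bus, adding a device is largely a matter of registering another address. This is an illustrative sketch only; the `Sensor` class and addresses stand in for a real bus driver (for example, a 1-Wire or I2C transaction):

```python
# Illustrative only: each digital sensor on a shared bus has an address,
# and "adding on" means registering one more device, not rewiring.

class Sensor:
    def __init__(self, address, name):
        self.address = address
        self.name = name

    def read(self):
        # Placeholder for an actual digital bus transaction.
        return {"address": self.address, "name": self.name, "value": 0.0}

bus = {}

def attach(sensor):
    """Register a sensor on the shared bus by its address."""
    bus[sensor.address] = sensor

def read_all():
    """Poll every registered sensor in one pass."""
    return [s.read() for s in bus.values()]

attach(Sensor(0x28, "temperature"))
attach(Sensor(0x53, "accelerometer"))  # expansion: one line, no new harness
print(len(read_all()))  # → 2
```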

The trend of what National Instruments calls Moore’s Law for Instrumentation and Control will continue, says Smith, because “the size and power of some measurement and control systems has dropped by greater than 90% over the last decade.” Another trend that will continue is functionality enabled by wireless communications—game consoles with the ability to update their own firmware is just one example.

“We get a lot of requests for wireless,” says Mayo.

  The Future is Bright
In fact, wireless is an integral part of the future evolution of sensors and embedded systems—which touches everything from personal products to vast systems and structures.

  For starters, says Gomez, “the home automation market seems ready to explode.”

  Sullivan agrees, noting that in general, home products fall into two categories:

• Essential needs—This includes medical devices and similar products that help people with disabilities.
• Richness—This includes interactive TVs, audio systems and similar devices that people can live without, but that help them to better enjoy their lives.

 

  These two can overlap. For example, when a continuous glucose monitor flags your blood sugar as being too low, a popup reminder on the TV could suggest proper action. This kind of pervasive healthcare, paired with “smart home” capabilities (that also provide an alert when the stove is left on, for instance), promises to help people remain independent longer.

  As an extension of this, Ales says further connectivity could allow people to wirelessly access their information—including, perhaps, health records—from wherever they are. The data cloud exchanges information with sensor-based systems to learn your routines and predict your needs.

  For this future to take shape, though, consumer acceptance is vital. So while Ales notes that today’s older population may shy away from such technology, he says those now in the 30 to 45 age bracket are accustomed to adopting gadgetry and will see the benefits. Younger generations will be even more accepting; this is the generation that Spohn predicts will be turning appliances on and off, watering the lawn and more from their hotel rooms while traveling.

  Gomez foresees a big push in the clean energy sector—a push evident in the work of WindLift, LLC, developer of low-cost, next-generation “tethered airfoil” wind turbines. Airborne wind energy systems have potential advantages over traditional turbines, including the ability to capture more of the available wind energy; the ability to scale to larger sizes and output ratings; and heavier duty, less-expensive transmission/generation systems (which can be located on the ground instead of a tower).

  Though the tethered airfoil concept was patented in the late 1970s, it never reached production because the necessary component technologies were not available at reasonable prices. These include:

• advanced wireless sensor networks and instrumentation;
• real-time computing for the autopilot system;
• reconfigurable, field-programmable gate array (FPGA) hardware that eliminates the need for costly custom circuit board designs; and
• sophisticated software algorithms for controlling power generation and grid synchronization.

 

  WindLift found them in National Instruments’ CompactRIO, a “convergence platform” that integrates these technologies and combines them with system-level software development tools. According to WindLift controls engineer Matt Bennett, the platform provided the needed power, flexibility and functionality—and the ability to seamlessly transition from prototype to production with the same hardware and software.

  NI’s Smith adds that renewables are an important part of the power generation and distribution area, which he identifies as one of the top frontiers for sensors and embedded systems. He describes, for instance, an electric grid that continuously monitors itself and effectively integrates energy being generated by traditional means, along with that being produced by new, renewable sources. Such a vision, he says, is “only possible if smart grid analyzers are able to monitor the grid, talk to each other, and make changes dynamically.” (See Figure 3.)

  Further ahead, Smith sees such systems able to sustain themselves indefinitely by harvesting the solar, thermal or vibrational power that is now wasted.

  Smith calls life sciences another important frontier, pointing out that embedded systems and sensors are key to the operation of advanced medical equipment.

  Gomez adds that long-term environmental and structural monitoring will be in increasing demand—for detecting seismic activity from within buildings and bridges, and for tidal wave detection from buoys, to name just two examples. This application will depend on low-power, “green” sensor systems, as will “smart dust”—sensor systems that can be sprinkled from helicopters and instantly create ad hoc sensor networks on the ground.

“That will happen soon,” Gomez says.

  Spohn suggests that with automated manufacturing—an application in which sensors are already established—there will be no downtime, because sensors are becoming inexpensive enough to enable redundancy in design. He says an embedded system will handle switching and notify technicians to change out the bad parts, which will then stand as the new backups.
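A minimal sketch of the redundancy scheme Spohn describes, with all names and values hypothetical: an embedded supervisor fails over to a backup channel when the primary dies, and flags the failed unit for replacement so production never stops:

```python
# Illustrative failover logic: try the primary sensor channel, fall
# back through backups on failure, and notify maintenance each time.

class SensorChannel:
    def __init__(self, name):
        self.name = name
        self.healthy = True

    def read(self):
        if not self.healthy:
            raise IOError(f"{self.name} failed")
        return 1.0  # placeholder measurement

def supervised_read(primary, backups, notify):
    """Return the first good reading, notifying on each failed channel."""
    for channel in [primary] + backups:
        try:
            return channel.read()
        except IOError:
            notify(f"replace {channel.name}")
    raise RuntimeError("all redundant channels failed")

alerts = []
a, b = SensorChannel("flow-A"), SensorChannel("flow-B")
a.healthy = False  # primary dies mid-shift
value = supervised_read(a, [b], alerts.append)
print(value, alerts)  # → 1.0 ['replace flow-A']
```

Once the technician swaps the failed unit, it becomes the new backup, exactly as Spohn describes.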

“What will be important in the future with sensors, I believe, are things like resolution and repeatability—aspects that are inherent in the sensor design,” says Spohn. This is in contrast, he adds, to things that change with the environment and for which an embedded system can compensate.
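Spohn's distinction can be illustrated with a toy compensation routine: repeatability is fixed by the sensor's design, but an environment-dependent offset, modeled here as a simple linear temperature drift with an assumed coefficient, is something the embedded system can remove in software:

```python
# Assumed values for illustration: a linear drift of 0.002 units per
# degree C away from a 25 C reference. The embedded system subtracts
# the modeled drift from each raw reading.
DRIFT_PER_DEG_C = 0.002
REFERENCE_TEMP_C = 25.0

def compensate(raw_reading, temperature_c):
    """Remove a modeled linear temperature drift from a raw reading."""
    return raw_reading - DRIFT_PER_DEG_C * (temperature_c - REFERENCE_TEMP_C)

print(compensate(1.05, 50.0))  # drift of 0.05 over 25 degrees removed, ≈ 1.0
```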

  Smith says that increased performance will require parallel or multi-core processing.

“What will continue to make it all work is a stable software interface,” he says. He admits that architecture changes will pose continual challenges for systems design and operation, but middleware that effectively abstracts the complexity of these advancements will make their benefits accessible to an ever-broader audience of engineers, who will harness that power to further innovate in their own fields.

More Info:
Boston Engineering
Kaman Precision Products
Kimberly-Clark Corp.
Lego (Mindstorms)
Maxim Integrated Products
Meggitt Sensing Systems
National Instruments


Barbara G. Goode served as editor-in-chief for Sensors magazine for nine years, and currently holds the same position at BioOptics World, which covers optics and photonics for life science applications. Contact her via [email protected].
