Peering into the Nano-Bio-Techno Future

Moore's Law turns 45; DE turns 15.

Moore's Law turns 45; DE turns 15.

By Kenneth Wong

 
Gordon Moore, Intel’s cofounder whose prediction of transistor count increases became known as Moore’s Law.

It was 1995: the year immediately following the debut of Netscape. It also marked the launch of DVD as a new medium; the birth of Yahoo! and eBay; and the passing of The Grateful Dead’s Jerry Garcia. At the time, PTC’s Pro/ENGINEER was in Release 15, Siemens PLM Software was still known as UGS, SolidWorks was getting ready to ship its first release, and the memorable AutoCAD R14 was still two years away. That year, at the end of August, just as Windows 95 (codenamed Chicago) hit the store shelves, the premiere issue of Desktop Engineering (DE) went off to press.

Fifteen years is but a blink in human history, but in the fast-evolving high-tech universe, that’s more than enough time for a digital boom, a gold rush, and a bust, paving the way for a renaissance with new hopes and promises. Some might argue that, having risen from the smoldering ashes of perished dot-coms and the financial crisis, the technology providers still on terra firma today are true survivors, much stronger for what they’ve endured.

This month, as DE turns 15, I check in with industry veterans and visionaries to figure out what’s ahead. If technology has (to borrow Thomas Friedman’s words) flattened the world, what risks and opportunities does the Brave New World have to offer? Some conversations take me into the cloud; others take me down to subatomic planes. Some experts argue that Moore’s Law, whose days are predicted to be numbered, is merely gearing up for a Second Coming. Others direct me to a treasure trove of Nature-inspired design ideas, ripe for plunder. While the experts don’t always subscribe to the same views, they seem to agree on one thing: The best is yet to come.

Sustaining Exponential Growth
For a while, the cyclical increases in computational horsepower were almost as reliable as death and taxes. As originally predicted by semiconductor researcher and Intel cofounder Gordon Moore in his paper (“Cramming More Components onto Integrated Circuits,” April 19, 1965), the number of transistors embedded in a single integrated circuit kept doubling every year. (Moore later revised his calculation, asserting a doubling of transistor count every two years instead.) His prediction, now commonly known as Moore’s Law, remained true throughout the ’70s, ’80s, and ’90s. Many hardware and software businesses literally banked on the exponential growth promised by Moore. But Moore himself recently warned that the party may soon come to an end, alarming quite a few souls.

“In terms of size [of transistors] you can see that we’re approaching the size of atoms, which is a fundamental barrier, but it’ll be two or three generations before we get that far, but that’s as far out as we’ve ever been able to see,” he observed (“Moore’s Law is Dead, says Gordon Moore,” April 13, 2005, Techworld).

In a paper published around the same time, Intel scientists acknowledged, “The 30-year-long trend in microelectronics has been to increase both speed and density by scaling of device components (e.g., CMOS switch). However, this trend will end as we approach the energy barrier due to limits of heat removal capacity” (“Limits to Binary Logic Switch Scaling–A Gedanken Model,” June 2005).
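For reference, the prediction at stake is a simple exponential. In a back-of-envelope restatement (our framing, not Moore’s own notation), the transistor count N after t years grows from a baseline count N_0 as

$$N(t) = N_0 \cdot 2^{t/2}$$

where the 2 in the exponent’s denominator reflects the revised two-year doubling period; the original 1965 prediction implied the steeper N_0 · 2^t.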

 
Ask Nature, a site that serves as a repository of Nature-inspired design ideas.

Getting More out of Moore’s Law
Anthony Neal-Graves, Intel’s general manager of the workstation group, cautioned that a eulogy for Moore’s Law may be premature. “The semiconductor industry has ideas in place that may extend Moore’s Law for the foreseeable future, past 2025,” he explains. Beyond the 32-nanometer parts Intel ships today, he hinted at the following progression in chip design, with approximately two years between generation leaps: 22 nm in 2012, 18 nm in 2014, 12 nm in 2020, and smaller as we move along.

 
A building model rendered in the iray rendering engine. In the future, Autodesk 3ds Max users are expected to be able to produce the same type of rendering using a cloud-hosted cluster.

“[6.5 nm] means a dual-processor workstation, which supports 158 gigaflops today, could support in excess of 28 teraflops,” Neal-Graves points out. “But you need to recognize it is not all about flops. Delivered performance is all about system balance. That means you need to plumb it correctly in order to deliver the performance that is possible … What is needed is advanced memory hierarchies and greater capacities as well as system bandwidth. These components are at the foundation of delivered performance.”
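For a sense of the arithmetic behind that projection, here is a quick sanity check (the figures come from the article; the doubling framing is our illustration, not Intel’s):

```python
import math

# Back-of-envelope check of the projected jump from 158 gigaflops
# (today's 32 nm dual-processor workstation) to 28 teraflops at 6.5 nm.
# Figures are from the article; the framing is illustrative.
base_gflops = 158
target_gflops = 28_000          # 28 teraflops, expressed in gigaflops

ratio = target_gflops / base_gflops
print(f"{ratio:.0f}x, or about {math.log2(ratio):.1f} doublings")
# -> 177x, roughly seven and a half doublings of delivered flops
```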

Some, like NVIDIA’s chief scientist Bill Dally, have suggested Moore’s Law could only be upheld by moving from serial processing to parallel processing. John Hengeveld, Intel’s senior strategist, suggests, “R&D is in the DNA of our future. It enables new levels of performance and addresses areas such as energy efficiency, scalability for multi-core, many-core and heterogeneous architectures, system manageability and security, and ease of use. It helps us look at computing in new ways and enables us to explore new ways of delivering it.”

  Whether it’s dubbed parallel processing or many-core architecture, this is the frontier where CPU makers Intel and AMD and GPU vendor NVIDIA will wrestle for market share.

 

21st Century Engineering Challenges
For the latter half of the 20th century, JFK’s words that crackled over the airwaves on May 25, 1961, summed up the simmering ambition of the postwar era. “I believe that this nation should commit itself to achieving the goal, before this decade is out, of landing a man on the moon and returning him safely to the Earth,” he charged. It was a call to boldly go where no man has gone before. It was a presidential charge that held NASA engineers’ feet to the fire throughout the Cold War.

Nearly a half-century later, as the Red scare faded away, it was supplanted by the Green threat. Former vice president Al Gore launched a book, a movie, and a movement that would echo the anxiety of the times. Climate change spawned many sustainability—or green—initiatives, some more earnest than others.

A glance at the list of projects at The National Academy of Engineering’s (NAE) Grand Challenges website reveals renewable energy and clean energy as people’s top concerns. According to poll results published by the NAE (after tallying 25,000 votes), the three top challenges are:

1. making solar energy economical
2. providing energy from fusion
3. providing access to clean water

 

Surprisingly, securing cyberspace ranked 14th, far below health informatics (9), carbon sequestration (6), and reverse-engineering the brain (4).

Analysis and simulation software makers found a place in the renewable-energy market with thermal and fluid-flow simulation packages. Green concerns also gave birth to life cycle assessment (LCA). CAD-embedded LCA tools are still in their infancy, beginning with SolidWorks Sustainability Xpress. Autodesk’s recent partnership with materials data provider Granta is expected to produce another solution, possibly built around Autodesk Inventor. Independent browser-based software like Sustainable Minds (an Autodesk partner) also gives engineers a way to evaluate environmental impacts.

What do you think is the next big engineering challenge? Visit DE Exchange or DE on Facebook to tell us.

Clouds Forecasted
Not every engineer has access to a private computer cluster equipped with hundreds of cores, ready to do his or her bidding day or night. But with a high-bandwidth connection, any engineer can easily tap into a remote cluster for parallel processing jobs.

For the most part, CAD modeling remains a sequential problem (it’s not a computing task that can easily be broken into smaller chunks), but analysis and rendering, two of the most computing-intensive tasks for CAD users, turn out to be ideal for parallel processing.
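To see why rendering and many analyses parallelize so cleanly, consider a minimal sketch in Python (render_tile is a generic stand-in for per-tile work, not any vendor’s API). Each tile of an image can be computed with no knowledge of its neighbors, so the job maps directly onto however many cores are available:

```python
from multiprocessing import Pool

def render_tile(tile_id):
    # Stand-in for expensive, independent per-tile work (ray casting, etc.);
    # nothing here depends on any other tile, which is the whole point.
    return tile_id, sum(i * i for i in range(100_000))

if __name__ == "__main__":
    # Pool() starts one worker process per CPU core by default.
    with Pool() as pool:
        tiles = pool.map(render_tile, range(64))  # 64 tiles in parallel
    print(f"rendered {len(tiles)} tiles")
```

Doubling the worker count roughly halves the wall-clock time; a CAD feature history, by contrast, must be replayed in order, which is why modeling resists the same treatment.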

  One of the many companies exploring the SaaS (software as a service) model for analysis is Dezineforce, a UK-headquartered company. Through what it describes as “HPC simulation on-demand,” Dezineforce offers users a way to remotely access its HPC infrastructure from a browser and perform engineering analysis and simulation tasks.

  Similarly, global engineering solution and service provider Altair has begun offering access to its own HPC infrastructure for clients who need additional computing horsepower through its PBS Works software suite. Altair is still assessing its pricing strategy for the on-demand HPC services. It’ll most likely be a pricing model based on the number of computing nodes and time required by the customer.

In the near future, Autodesk 3ds Max users may simply push a menu button to render their scenes in the cloud, on a remote cluster equipped with NVIDIA Tesla GPUs. The new function is powered by the iray renderer from mental images, a wholly owned subsidiary of NVIDIA. Though Autodesk hasn’t specified when it will be market-ready, the tool was demonstrated live at this year’s NVIDIA GPU Technology Conference to an enthusiastic crowd.

“[Cloud-hosted GPUs] are all running exactly the same iray software that comes with 3ds Max,” says Michael Kaplan, vice president of strategic development, mental images. “We can guarantee that the image that you get from [the cloud-hosted iray renderer] is exactly the same, pixel for pixel, as what you would get from 3ds Max.”

  Autodesk is currently previewing a similar function for Autodesk users, called Project Neon. Hosted at Autodesk Labs, Project Neon lets AutoCAD users upload a scene with predefined camera views and render them remotely. Because remote rendering leaves your local machine’s CPU and GPU free, you can continue to work on your machine while the rendering session is in progress.

The Machine Is Your Silent Partner
Brian Mathews, vice president of Autodesk Labs, thinks cloud computing has been pigeonholed. Sure, the anticipated availability of web-hosted clusters at affordable rates will let you design “better, cheaper, faster,” but he points out the trend also makes it possible for your machine to do what he calls “predictive computing” or “speculative computing.”

“Instead of running one simulation over 12 hours, you might compress [the simulation time] down to a few seconds [by renting additional cloud-hosted cores]. Now, instead of looking at one alternative, you might look at three alternatives. In fact, you might automate those alternatives, using genetic algorithms,” he says.

In the not-so-distant future, Mathews imagines, your design alternatives (say, two versions of a building with different window types) can be automated to cross-breed and spawn offspring, or derivatives. These derivatives can in turn produce more derivatives, thus multiplying the pool of design alternatives for as long as you can compute.

“You can analyze these variations by renting a massively parallel system,” he says. “That way, instead of getting a better design, you’ll arrive at the optimum design.” The cloud, he feels, not only “amplifies your design imagination” but lets you “optimize your design to a level that wasn’t possible before.”
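A toy genetic algorithm makes the mechanics concrete. In this minimal sketch (our own illustration, not Autodesk code), each design is a short vector of normalized parameters, say window dimensions and glazing ratio, and fitness() stands in for the expensive simulation you would farm out to rented cores:

```python
import random

def fitness(design):
    # Stand-in for a costly cloud-run simulation; here, designs score
    # best when every parameter sits near an arbitrary optimum of 0.6.
    return -sum((x - 0.6) ** 2 for x in design)

def crossover(a, b):
    # "Cross-breed" two parent designs, gene by gene.
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(design, rate=0.1):
    return [min(1.0, max(0.0, x + random.uniform(-rate, rate))) for x in design]

population = [[random.random() for _ in range(3)] for _ in range(20)]
for _ in range(50):                                 # 50 generations
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                       # keep the fittest half
    children = [mutate(crossover(*random.sample(parents, 2)))
                for _ in range(10)]
    population = parents + children

best = max(population, key=fitness)
print("best design:", [round(x, 2) for x in best])
```

Selection keeps the fittest half, crossover mixes surviving parents, and mutation keeps the pool diverse; in Mathews’ scenario, the fitness evaluations are the part you would run on a massively parallel system.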

  But going a step further, he imagines the machine (he means both the local machine and the networked cores accessible through it) performing heavy analyses in the background while you refine your design. While you’re working, the machine can also be anticipating your next move (for example, anticipating an extrusion command from you when you finish sketching a 2D profile), and doing some of the required computing for the expected task before you demand it.
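Here is a minimal sketch of that idea, with a hypothetical precompute_extrusion() standing in for real CAD geometry code:

```python
from concurrent.futures import ThreadPoolExecutor

def precompute_extrusion(profile):
    # Hypothetical stand-in for the geometry work behind an extrude
    # command; here we simply loft each 2D vertex to two z-heights.
    return [(x, y, z) for (x, y) in profile for z in (0.0, 10.0)]

executor = ThreadPoolExecutor(max_workers=1)
profile = [(0, 0), (1, 0), (1, 1), (0, 1)]

# Speculatively start the expected next step while the user keeps sketching.
future = executor.submit(precompute_extrusion, profile)

# ... later, if the user does invoke Extrude, the answer is already waiting;
# if the user does something else, the speculative result is simply discarded.
solid = future.result()
print(len(solid), "vertices ready")
executor.shutdown()
```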

Natural Engineering
Arguably, Nature (that’s Nature with a capital N) is the most seasoned engineer, as it has had millions of years to perfect its concepts and manufacturing through trial and error. So it’s only natural that, prompted by the need for sustainable design, people turn to Nature for inspiration.

The movement to study and, when appropriate, duplicate Nature’s best work is known as biomimicry. The best ideas in this field, of course, are to be found in the field: in the water-resistant texture of petals, in the flexible limbs of crabs, and in the water-storing cactuses of the desert. But you may also find a repository of biomimicry projects online at Ask Nature, a site maintained by the Biomimicry Institute.

  Janine Benyus, cofounder and board president of the Institute, explained, “Imagine nature’s most elegant ideas organized by design and engineering function, so you can enter ‘filter salt from water’ and see how mangroves, penguins, and shorebirds desalinate without fossil fuels. Now imagine you can meet the people who have studied these organisms, and together you can create the next great bio-inspired solution.”

Though the archive is at the moment sparsely populated, you can read about a number of Nature-driven design projects, such as a London building with large panels of glass, inspired by the ribs and struts of the giant water lily; a car designed with a computer-aided optimization model, inspired by the structural strength of skeletal tissues; and many others. There’s one clear advantage to stealing Nature’s designs: they’re not copyright protected.

 

Transistor Growth vs. Horsepower Increase
Anthony Neal-Graves, Intel’s general manager of the workstation group, makes a distinction between the doubling of transistor count in a chip every two years (what Moore’s Law says) and the doubling of computing horsepower every two years (what many assume would follow from Moore’s Law).

“When we double the transistors, we introduce new microarchitectures that compute more instructions in a given compute cycle,” he explains. “As we doubled transistors and increased the number of instructions, we also increased the clock cycles, but the physics of cooling ever higher frequencies has proven difficult. We can increase frequencies; we just don’t know how to economically cool them.”

This chip-architecture conundrum, he points out, gave birth to the era of multi-core computing, soon to be followed by many-core architectures, as a way to further increase the number of instructions computed in a given compute cycle.
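One caveat worth adding to that point (Amdahl’s law, a standard result we are supplying here, not something Neal-Graves cites): the speedup S delivered by n cores is capped by the fraction p of a workload that can actually run in parallel,

$$S(n) = \frac{1}{(1 - p) + p/n}$$

so even with p = 0.9, an unlimited number of cores yields at most a 10x speedup. That is why parallelizing the software matters as much as adding the cores.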

Micro Management
Need to incorporate electrical component designs, such as circuit boards, into your CAD model? PTC offers an ECAD-MCAD Collaboration Extension for Pro/ENGINEER (now known as Creo Elements/Pro) users. SolidWorks gives you CircuitWorks, a partner-developed plug-in it acquired in 2008. Similarly, Siemens PLM Software recently announced Mechatronics Concept Designer, an NX-integrated package for those who work with mechanical and electrical components.

CAD software developers are widening their horizons toward the practice of modeling and simulating electromechanical interactions. But another area, one that requires a sharper focus on a smaller scale, remains a blind spot.

Nanotech, the study of material behaviors at the atomic and molecular level, may hold the key to unlocking next-generation chip design, which is expected to stall against the physical limit of how small a transistor can get. Nanotech is also an area largely neglected by CAD modeling software vendors, possibly because it lies beyond their established expertise. Whereas current CAD programs are designed to replicate classical mechanical behaviors, nano-particle behavior may require different modeling engines and a different modeling principle altogether.

  NanoEngineer from Nanorex is one of a handful of programs researchers and scientists currently use to model DNA structures and molecular gears. (For more, read “Nanotechnology Enables Real Atomic Precision” by Susan Smith, October 2009).

The Mac Effect
In October, when unveiling what he described as PTC’s vision for CAD for the next 20 years, Brian Shepherd, PTC’s vice president of product development, borrowed Apple’s slogan for the iPhone and iPad: “There’s an app for that.”

PTC’s new product line, branded Creo, replaces what was known as Pro/ENGINEER, CoCreate, and ProductView, which become Creo Elements/Pro, Creo Elements/Direct, and Creo Elements/View, respectively.

In contrast to Autodesk, PTC takes a more cautious approach to cloud computing. On more than one occasion, Shepherd has said, “We’re neither pro-cloud nor anti-cloud … We’re ambivalent, or open, or agnostic about the way we deliver software.”

Yet Jim Heppelmann, the company’s CEO, made an offhand comment that “You’ll probably see [a Creo app] on iPad before you see it on cloud.” (He didn’t say, however, how he might bypass the cloud as the traditional method for delivering applications to a web-connected device like the iPad.)

“What Jim was talking about is the work-from-anywhere idea,” explains Shepherd. “Product development is becoming a 24/7/365 activity. People don’t want to have to go back to the office to their desktops … The beauty of the Creo architecture announced [in October] is that we don’t have to deliver an app for every operation on iPad.”

Borrowing a page from Apple’s iPhone playbook, PTC plans to break up professional CAD operations into smaller chunks, manageable in Apple-style apps (for example, a Creo app for surfacing, another for rendering, another for markup and visualization, and so on). This new approach, Shepherd points out, puts PTC in a nimbler position to respond to market and customer demands with OS-agnostic or OS-specific apps, which would take less time to develop than porting an entire CAD program to a new platform.

As portable devices like the iPad and iPhone become field crews’ and road warriors’ preferred communication method, traditionally Windows-focused CAD vendors are making their software Mac-friendly. Dassault Systèmes, for example, strives to deliver its 3DVIA software line to iPad users. Its 2D-specific drafting program, DraftSight, is now available for both Windows and Mac. Similarly, Autodesk has recently released a Mac-compatible version of its flagship 2D software, AutoCAD. The company also delivered iPad- and iPhone-compatible versions of AutoCAD WS, a 2D viewing and markup application.

  Lighter Apps, Heavier Computing
One of the first PTC Creo apps to be delivered, says Shepherd, is a simulation app, ideally suited for parallel processing. “That’s a role-specific app [meant for an analyst] that can take advantage of distributed computing for certain types of analyses,” he says. “For instance, in optimization studies, you’ll need to solve the same analysis many times with different boundary conditions.”
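That pattern, often called a parameter sweep, is embarrassingly parallel: each run is independent, so the runs can be distributed with almost no coordination. Here is a minimal sketch (solve() is a generic stand-in, not a Creo API):

```python
from concurrent.futures import ProcessPoolExecutor

def solve(boundary_load):
    # Stand-in for one run of the same analysis under one boundary
    # condition; each run is independent of all the others.
    return boundary_load, (boundary_load ** 0.5) * 42.0

if __name__ == "__main__":
    loads = [100.0 * i for i in range(1, 33)]   # 32 boundary conditions
    with ProcessPoolExecutor() as pool:         # local cores, or rented ones
        results = list(pool.map(solve, loads))
    load, response = min(results, key=lambda r: r[1])
    print(f"lowest response {response:.1f} at load {load:.0f}")
```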

Another Creo app slated to appear in version 1.0 is a rendering app, also suitable for distributed computing for faster results. “The engineering apps’ appetite for more CPU cycles isn’t going away,” says Shepherd. “People want more realistic simulation, more realistic animation, more realistic everything.”

  If the engineering problem of this—and the next—decade is duplicating reality, perhaps the answer, too, is to be found in mimicking how people tackle complicated tasks in real life. Usually, they break up the task into smaller tasks, assign them to those best qualified to execute them, then collect the results at the end of the project—a perfect example of distributed workflow.

  For more future engineering prognostications, visit DE’s Virtual Desktop blog at deskeng.com/virtual-desktop and listen to recorded conversations with more industry veterans.


Kenneth Wong writes about technology, its innovative use, and its implications. One of DE’s MCAD/PLM experts, he has written for numerous technology magazines and writes DE’s Virtual Desktop blog at deskeng.com/virtual_desktop/. You can follow him on Twitter at KennethwongSF, or send e-mail to [email protected].
