June 21, 2018

Digital Manufacturing: A 70-Year Odyssey, from “Sneakernet” to Machine Learning

Katie Corey

This post originally appeared in the Navigate the Future blog.


From the birth of Numerical Control (NC) in the late 1940s—a collaboration between John Parsons and Frank Stulen (both received the National Medal of Technology and Innovation), the MIT Servomechanisms Lab, and the U.S. Air Force to machine the complex contours of aerospace parts—to today’s model of fully digitized ideation through production, digital manufacturing has followed a remarkable path from baby steps to adulthood.

The punched-tape inputs of NC soon gave way, with the advent of the microprocessor, to Computerized Numerical Control (CNC), to direct the automated path of the cutting tool based on manually drafted, then CAD-generated, designs. These single-cell organisms—stand-alone machine tools—were linked via Distributed Numerical Control (DNC), so toolpath data could be distributed from a single workstation to multiple machine tools. The machine tools were aggregated into larger, multitasking work cells. DNC fed information back to the centralized controller, opening up two-way chatter between manufacturing and the supervisory and enterprise layers.
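To make the DNC idea a little more concrete, here is a minimal, purely illustrative Python sketch—the class names and data are invented for this post, not drawn from any historical DNC system. A central workstation pushes a toolpath program out to the machine tools in a cell and collects their status reports, the two-way chatter described above.

```python
# Illustrative sketch only: a toy model of the DNC concept, not a real protocol.
# ToolpathServer and Machine are invented names for illustration.

class Machine:
    """A stand-alone machine tool that can run a toolpath program."""

    def __init__(self, name):
        self.name = name
        self.status = "idle"

    def run(self, program):
        # A real machine would drive the cutting tool along the toolpath;
        # here we simply record that the program was executed.
        self.status = f"completed {program['part']}"
        return {"machine": self.name, "status": self.status}


class ToolpathServer:
    """Central workstation that distributes programs and collects feedback."""

    def __init__(self, machines):
        self.machines = machines
        self.feedback = []  # status reports flowing back up to the controller

    def distribute(self, program):
        for machine in self.machines:
            report = machine.run(program)
            self.feedback.append(report)


if __name__ == "__main__":
    cell = [Machine("mill-1"), Machine("lathe-2")]
    server = ToolpathServer(cell)
    server.distribute({"part": "bracket-A", "gcode": "G01 X10 Y5 F200"})
    for report in server.feedback:
        print(report)
```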

One of the groundbreaking concepts in what is today called digital manufacturing was Computer Integrated Manufacturing (CIM). The term, coined by Joseph Harrington, Jr., in his 1973 book, Computer Integrated Manufacturing, was initially embraced as automation, focused on CAD/CAM, CNC, DNC, and machine tools networked into cells. As computer processing became more affordable, it evolved into an information systems notion—the ability to control the data flowing across engineering and manufacturing. Next came the enterprise view of CIM—linking manufacturing, with all its data, to the rest of the corporation, from marketing, sales, service, and finance to the C-suite.

Much of the nascent effort revolved around setting standards for a unified data structure, to ensure interoperability between different products and systems and reduce “Sneakernet”—people carrying files back and forth on disks. Another emphasis was on relational database technology and query languages, to stop wasting time combining and relating data from separate databases.
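As a purely illustrative sketch of what that shift toward query languages bought, the snippet below uses Python’s built-in sqlite3 module (the table and column names are invented) to relate part and work-order records with a single declarative join, rather than hand-merging files from separate databases.

```python
# Illustrative sketch only: the kind of relational query the CIM era was
# moving toward. Table and column names are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE parts (part_id TEXT PRIMARY KEY, description TEXT);
    CREATE TABLE work_orders (order_id INTEGER PRIMARY KEY, part_id TEXT, qty INTEGER);
    INSERT INTO parts VALUES ('BRKT-A', 'Aluminum bracket');
    INSERT INTO work_orders VALUES (1001, 'BRKT-A', 250);
""")

# One declarative query relates engineering and production data,
# instead of manually combining two separate data sources.
rows = conn.execute("""
    SELECT w.order_id, p.description, w.qty
    FROM work_orders AS w
    JOIN parts AS p ON p.part_id = w.part_id
""").fetchall()

for row in rows:
    print(row)
```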

A big goal was to push compute power down to the user and workstation—in April 1984, IBM exhibited its first industrialized, hardened PC, the 5531, at a Texas trade show. It’s hard to imagine today the constraints of those platforms—an IBM AT model, configured with 64 KB of RAM and a 20 MB hard drive, was considered a beast. But some things never change—it was tough to find people. A report from that time, US Industrial Automation: A View to the Future, stated that “a major hindrance to the implementation of industrial automation technology in the United States is the lack of qualified personnel.”

One pioneering adopter compared the CIM initiative to the first laps of a mile run—turns out digital manufacturing is more akin to a marathon. But it’s taken an irreversible hold on the business imagination, worked out many of the kinks, and is moving forward rapidly, embracing technology like machine learning—machines learning to tend to the business of making things largely on their own, to “autonomatize” manufacturing so companies can turn on a dime.
