
How Augmented Reality Bridges the Gap Between Humans and Machines

August 7, 2019

Editor's Note: This post was updated in January 2020.

The Industrial Internet of Things (IIoT) has paved the way for machine automation of simple, routine tasks on smart, connected products (SCPs). In parallel, augmented reality is empowering workers and yielding incredible gains in productivity. In the future, work will be a mix of tasks conducted collaboratively by humans and machines. Augmented reality is the next-generation human-machine interface (HMI) to bridge the workflow gaps between those tasks and maximize the efficiency of human-machine interactions.

Automation Replaces Tasks, Not Work

The emergence of a task ecosystem in which workers and machines operate in sync is replacing dated utopian automation mindsets like the ‘lights-out factory’. Even technology pioneers like Elon Musk have cited the importance of human workers and the pitfalls of excessive automation, pointing to its detrimental impact on Tesla production.

An influx of autonomous systems and AI-powered robotics is increasingly replacing manual tasks. This is notably different from saying automation is replacing jobs; McKinsey estimates that less than 5% of jobs consist of activities that are 100% automatable. While a worker's future daily tasks are subject to major change, their employment status likely is not, as industries such as manufacturing are forecasting massive labor shortages.

Take an assembly worker, for example. Through increasing digitization, much of their monotonous, paper-based documentation work is likely to be automated, and the parts of the assembly process least favorable to them (heavy lifting, repetitive actions, etc.) will be completed by machines. However, variable and complex tasks like final quality verification of a manufactured product still require a human expert in the workflow to issue the final stamp of approval.

Human-Machine Interfaces Are Built for Previous Generations of Work 

Traditional HMIs cannot effectively contextualize and mediate these future workflows, which now span both physical and digital work information. The Boeing 737 MAX is a tragic example of legacy HMIs incapable of managing tasks and interactions between autonomous systems and humans: faulty data from a physical airplane sensor triggered a dangerous automated maneuver, and the pilots had no interface through which to intervene.

To accommodate the rapid hand-off of tasks within workflows like these, whether in airplanes or in manufacturing processes, robots and humans will require both a collaborative relationship with constant cross-exchange of information and novel methods of interaction. Managing this influx of cyber-physical systems, and the future of work, calls for an on-demand, in-context interface that leverages human oversight and instruct capabilities within the surrounding environment.

The PTC Reality Lab is actively working on the next generation of human-machine interaction. Using augmented reality and spatial computing, a complex task can be completed with speed and efficiency through spatial context. Watch this demonstration video to see how the Reality Editor is used to program a robotic arm:

Augmented Reality Is the HMI for Front-Line Workers and Their Machines

Augmented reality (AR) is the purpose-built interface for front-line workers to monitor, and increasingly to control and optimize, connected machines. This emerging capability is needed as industrial companies strive for flexibility and agility yet remain challenged by downtime.

A cutting-edge example of human-machine collaboration in factories is the intersection of AR for humans and cobots for machines. Traditionally, industrial robots are expensive, fixed in place, and unsafe for humans to work alongside. Cobots stay true to their collaborative name by providing a more flexible, low-cost option for manufacturers, one that frees workers to take on higher-level tasks.

AR provides the natural lens for workers to instruct machines within their environment and for computation to actuate actual commands, whether that is kinetic machine control or motion programming of a cobot or another industrial robot. In traditional robot reprogramming procedures, an operator has to leave the area to update the robot's configuration; AR instead brings a virtual, interactive dashboard to the shop floor in a timely, immersive experience, potentially saving thousands in changeover costs.
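To make the idea concrete, here is a minimal sketch, in Python, of how a point an operator taps in AR might become a motion command for a robot. The calibration matrix, frame conventions, and send_move() call are hypothetical stand-ins; a real deployment would go through the robot vendor's SDK.

```python
# Sketch: convert an AR tap into a robot-frame move command.
# All names and values here are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Pose:
    x: float  # meters, in the robot's base frame
    y: float
    z: float

# Hypothetical 4x4 homogeneous transform mapping AR-world coordinates
# into the robot's base frame, captured once during spatial calibration.
AR_TO_ROBOT = [
    [1.0, 0.0, 0.0, -0.25],
    [0.0, 1.0, 0.0,  0.10],
    [0.0, 0.0, 1.0,  0.00],
    [0.0, 0.0, 0.0,  1.00],
]

def ar_tap_to_pose(tap, t=AR_TO_ROBOT):
    """Convert a tapped AR point (x, y, z) into a robot-frame Pose."""
    x, y, z = tap
    p = [x, y, z, 1.0]
    out = [sum(t[r][c] * p[c] for c in range(4)) for r in range(3)]
    return Pose(*out)

def send_move(pose):
    """Stand-in for a vendor SDK call that queues a linear move."""
    print(f"MOVE_L -> x={pose.x:.3f} y={pose.y:.3f} z={pose.z:.3f}")

if __name__ == "__main__":
    send_move(ar_tap_to_pose((0.60, 0.20, 0.35)))  # operator taps a point
```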

Combining data from an SCP with AR's interact and instruct capabilities could create a new ‘Learning from Demonstration’ method, in which an operator dictates a robot's task by using AR to tether workflows and anchors in physical space, with adjustable constraints based on IIoT data. For example, an operator could program a machine through AR, instructing it to weld precisely within certain measurements, while IIoT gives real-time feedback that the welding machine is misaligned or about to break down.
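As a hedged sketch of that workflow: the waypoints below stand in for anchors an operator demonstrated in AR, and a simulated IIoT reading gates each weld against an adjustable tolerance. The names, tolerance value, and sensor feed are illustrative assumptions, not a real PTC API.

```python
# Sketch: AR-demonstrated weld waypoints gated by live IIoT telemetry.
import random

WELD_TOLERANCE_MM = 0.5  # adjustable constraint set by the operator in AR

def read_alignment_offset_mm():
    """Simulated IIoT telemetry: lateral misalignment of the weld head."""
    return random.uniform(0.0, 0.8)

def run_weld_program(waypoints):
    """Weld each waypoint, halting if telemetry exceeds the tolerance."""
    for i, (x, y) in enumerate(waypoints):
        offset = read_alignment_offset_mm()
        if offset > WELD_TOLERANCE_MM:
            print(f"waypoint {i}: misaligned {offset:.2f} mm -> halt, alert operator")
            return False
        print(f"waypoint {i}: weld at ({x:.1f}, {y:.1f}) mm, offset {offset:.2f} mm OK")
    return True

# Waypoints 'demonstrated' by the operator as AR anchors in the workspace.
run_weld_program([(0.0, 0.0), (25.0, 0.0), (50.0, 5.0)])
```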

The Collaborative Future of Work Is Here

Although still at the very beginning of its maturity, this future of human-machine collaboration is already on display in university labs and AR providers' research divisions. Brown University's robotics lab is demonstrating with its Holobot the potential of AR as an interface for dictating robotic movements through voice and gesture commands.

PTC's Reality Lab has brought this next-generation AR instruct concept to its feeder machine. The Reality Editor provides logic flows through drag-and-drop programming, letting the operator dynamically control the feeder and set tasks with IIoT-generated data in mind.
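Conceptually, such a drag-and-drop logic flow is a small node graph: a sensor node feeds a threshold node, which gates an actuator node. The sketch below, with assumed node names and feeder behavior, illustrates the pattern; it is not the Reality Editor's actual implementation.

```python
# Sketch: a logic flow as a tiny node graph (sensor -> threshold -> actuator).

class Node:
    def __init__(self, name, fn):
        self.name, self.fn, self.inputs = name, fn, []

    def link(self, upstream):
        """Wire another node's output into this node's input."""
        self.inputs.append(upstream)
        return self

    def evaluate(self):
        """Pull values from upstream nodes, then apply this node's function."""
        return self.fn(*[n.evaluate() for n in self.inputs])

hopper_level = Node("hopper_level", lambda: 12.0)           # IIoT reading (%)
low_level = Node("low_level", lambda v: v < 20.0)           # threshold block
feeder_motor = Node("feeder_motor",
                    lambda low: "RUN" if low else "IDLE")   # actuator block

low_level.link(hopper_level)
feeder_motor.link(low_level)

print(feeder_motor.evaluate())  # -> RUN, because the hopper is low
```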

[Image: AR feeder machine demo]

The Reality Lab is also extending this interface to mobile cobots with its automated guided vehicle, Frida. The bot leverages kinetic AR and spatial mapping to program its motion through physical space. These instructions include path planning from set waypoints, at which the bot can execute a designated task. The prototype uses AR for intuitive motions and actions, such as following the user's position in real time. In an industrial environment, an operator could use kinetic AR to send a mobile cobot from station to station carrying heavy loads, a typically strenuous task for humans.
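A minimal sketch of that station-to-station dispatch, assuming hypothetical station coordinates and a naive straight-line planner standing in for the AGV's real navigation stack:

```python
# Sketch: route a mobile cobot through AR-placed stations via waypoints.
import math

STATIONS = {"assembly": (0.0, 0.0), "inspection": (6.0, 2.0), "packing": (6.0, 8.0)}

def plan_path(start, goal, step=1.0):
    """Straight-line path as evenly spaced (x, y) waypoints."""
    (x0, y0), (x1, y1) = start, goal
    n = max(1, int(math.hypot(x1 - x0, y1 - y0) / step))
    return [(x0 + (x1 - x0) * i / n, y0 + (y1 - y0) * i / n) for i in range(n + 1)]

def dispatch(route):
    """Drive through each named station in order."""
    pos = STATIONS[route[0]]
    for stop in route[1:]:
        path = plan_path(pos, STATIONS[stop])
        # A real AGV controller would drive toward each waypoint in turn.
        pos = STATIONS[stop]
        print(f"{stop}: reached via {len(path)} waypoints")

dispatch(["assembly", "inspection", "packing"])  # carry a load across stations
```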
 
[Image: Mobile cobot AR demo]
 
Maintaining safe interactions with humans on the ground and in the sky will be a crucial requirement for cobots. The University of Colorado Boulder's ATLAS Institute is demonstrating on-the-fly programming of a drone to command its flight path and ensure it can safely operate in the same workspace as a worker while completing complementary tasks.
 

Final Thoughts

AR will increasingly be the tool front-line workers use to interact with the wealth of digitized information spread across industrial environments and to instruct machines to complete tasks within them. While guide and visualize capabilities are driving growing adoption across design, operations, sales and marketing, service, and training use cases, a future frontier for AR innovation will be its untapped potential as an HMI, bridging the collaboration gap between humans and machines.

 


 
David Immerman

David Immerman is a Consulting Analyst for the TMT Consulting team based in Boston, MA. Prior to S&P Market Intelligence, David ran competitive intelligence for a supply chain risk management software startup and provided thought leadership and market research for an industrial software provider. Previously, David was an industry analyst in 451 Research's Internet of Things channel, primarily covering the smart transportation and automotive technology markets.
