Connecting the physical and the digital – this idea is core to PTC’s technologies and mission; it’s right in our logo and it’s close to our hearts.
Digital twin, digital thread, IIoT, augmented reality (AR)… these concepts and technologies are transforming how workers do their jobs – and changing entire industries. The technology has come a long way, but there’s a lot more potential ahead.
Consider this: The way we currently engage with technology is primarily through screens, whether it’s a smartphone, a computer monitor, or a machine UI. These screens give us insight and control, but they’re also limiting and often disconnected. Disparate systems don’t talk to one another and can make what should be a simple process into a complicated set of steps. There is a growing need for a new type of human-machine interface (HMI) that provides a universal way for people to understand and control complex, interconnected computing systems: a universal HMI.
The researchers in the PTC Reality Lab are keenly interested in this problem – and are actively working on solutions, particularly with AR, that remove barriers and empower humans to engage more seamlessly with digital technologies. A key component of these solutions is Reality Editor.
I recently sat down with Ben Reynolds – who has been developing the Reality Editor with PTC Reality Lab’s lead scientist Valentin Heun since his time as an undergrad at MIT and a grad student at the MIT Media Lab – to find out more.
“At a basic level, Reality Editor is a system that allows you to visualize information from machines, control and interact with them, even reprogram them, through augmented reality,” says Reynolds.
Now the Innovation Engineering Director at PTC, Reynolds is continuing his work on the Reality Editor in collaboration with his colleagues. A bit of a jack-of-all-trades for the team, he collaborates closely with Heun and team members to create easy-to-use, intuitive user interfaces for machines using augmented reality.
The very early iterations of Reality Editor yielded humble but thought-provoking demos – for example, using AR to show the songs playing on a radio, or to change the color of a lamp with an AR color picker. These demos showed a first iteration of what Heun later described as bi-directional augmented reality.
These demos were so powerful that four master’s theses and one PhD thesis would follow to deeply explore the potential of this technology.
Now at PTC, the team is applying this entire body of research to complex industrial systems and processes, but with the same end goal: to enhance human interactions between the physical and digital worlds.
“When you point an AR device – such as your phone – at a computer or machine connected to the Reality Editor platform, a user interface for that machine pops up in augmented reality, directly mapped onto the shape of the machine,” Reynolds explains.
The interface allows the user to do simple tasks like reading a sensor value or turning a motor on or off. But that’s just the start; it has a lot of power and supports more complex tasks like moving digital content between computers in a spatial way, or even drag-and-drop programming to connect multiple machines together.
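To make the idea concrete, here is a minimal sketch of the kind of logic link a user might draw in AR between two machines – a sensor value on one driving a motor on another. All class, method, and node names below are hypothetical illustrations of the concept, not the actual Reality Editor API.

```python
# Hypothetical sketch: linking a sensor node to a motor node, the way a user
# might drag a connection between two machines in AR. Names are illustrative.

class Node:
    """A named input/output point exposed by a machine."""
    def __init__(self, name):
        self.name = name
        self.listeners = []   # downstream nodes to notify on each new value
        self.value = None

    def link_to(self, other):
        # corresponds to the AR gesture of drawing a line between two nodes
        self.listeners.append(other)

    def write(self, value):
        self.value = value
        for node in self.listeners:
            node.write(value)

class Motor(Node):
    """A node that switches a motor on when its input exceeds a threshold."""
    def __init__(self, name, threshold):
        super().__init__(name)
        self.threshold = threshold
        self.running = False

    def write(self, value):
        super().write(value)
        # simple rule: run while the linked sensor reads above the threshold
        self.running = value > self.threshold

# Link a temperature sensor on one machine to a cooling fan on another.
sensor = Node("oven/temperature")
fan = Motor("cooler/fan", threshold=80)
sensor.link_to(fan)

sensor.write(95)
print(fan.running)  # True: the reading exceeded the threshold
sensor.write(60)
print(fan.running)  # False: the reading fell back below it
```

The point of the sketch is the wiring model: once nodes are linked, data flows between machines without the user writing any glue code – which is what the drag-and-drop programming described above exposes visually.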
“AR gives people the ability to interact with machines in new and powerful ways, with relatively little training, which has diverse industrial applications,” Reynolds says.
Check out this video for a quick demo:
There’s a bigger picture here: The current state of technology in industrial environments is essentially the more the merrier – phones, computers, sensors, machines, and all manner of smart, connected things. While there are internet technologies (e.g., IIoT) that connect the data from these devices, with the current tools each computer or machine still more or less feels like a separate “box.” In other words, a room full of computers still feels like a room full of computers, rather than one larger, interconnected computing environment.
This is an important distinction, and enabling the latter is an incredibly powerful idea for the complexities of the factory.
“Reality Editor attempts to blur the lines between all these separate boxes by giving people a universal tool they can use to interact with all of these physical and digital things in a natural way, rather than going back to a computer terminal and writing code,” Reynolds says.
In many ways, Reality Editor is the glue that ties the PTC Reality Lab’s work together. For example, it can be used to program a robot and allow it to interact safely with a piece of machinery. Instead of forcing the worker to juggle multiple UIs, Reality Editor streamlines the communication process.
“We’re fascinated with the idea of breaking down the barriers of different user interfaces and we’re pushing that idea further through different experiments,” Reynolds says. The team is working on something particularly compelling to be unveiled at LiveWorx this June.
In their new space at PTC’s Boston headquarters, Reynolds describes the atmosphere as “controlled chaos” where creativity reigns. “We’re all self-motivated and autonomous, we all love building things, and encourage each other to run wild with ideas.”
Reynolds explains, “I’ll build a prototype, send it to the rest of the team, and it’s exciting to see what kind of reactions they have, then we can tweak it, add to it – and that’s when it starts to get fun.”
Stay tuned for updates and insights on PTC’s Reality Lab here on our blog. Keep tabs on their exciting work by following them on Twitter, @PTCRealityLab.