For decades, technologists and futurists have dreamed of uniting the digital and physical worlds to enable a well-balanced interaction between humans, machines, and their environment.
In 2020, we’re inching toward making this type of computing, spatial computing, a reality. As spatial technologies like augmented reality (AR), mapping, and machine learning become more commonplace (they have already taken hold in gaming, shopping, and GPS navigation), more possibilities are emerging.
At the PTC Reality Lab, the team led by Dr. Valentin Heun is exploring spatial computing in industrial environments and how augmented reality can provide an intuitive user interface at scale: architectural scale.
Augmented reality, as we’ve known it thus far, has been built from a product-design perspective: an AR interface is used to interact with a single object in a single application.
At architectural scale, AR moves into the territory of architects: rooms, buildings, neighborhoods, cities, and all the things within those spaces. At this level, an augmented reality user interface serves multiple users, applications, and objects within multiple spaces.
Spatial computing is the means to better understand how humans, machines, products (and more) move and relate to each other within a space. At architectural scale, there is great opportunity for spatial computing to deliver more powerful insights, especially in dynamic spaces like factories.
Together, spatial computing and augmented reality at architectural scale offer an expanded, real-time perspective on a given space, and a lens through which to view new, high-value information. For this research initiative, the team collaborated with architect Nikolaos Vlavianos from MIT’s Design and Computation Group. His architectural and design background gave the team a fresh perspective, and necessary expertise, as they explored this idea.
Let’s back up for a second and talk about why this research is vital – and why it will impact the way people work in the future.
Think about the way you currently navigate a space you’ve never visited before. There are foundational elements of the space, such as walls and doors, but also different types of markers that help you move around safely and efficiently, like printed signage or electric signals.
One can think of these markers on a continuum: from monumental, static items like walls, to changeable yet static items like signage, to traffic lights, which change to convey information. These markers are stationary; they don’t allow for much flexibility, nor do they fully represent the changing nature of a given space. That’s where spatial computing, with augmented reality as a user interface at architectural scale, comes in.
Take a modern factory floor. Robotic arms line a conveyor belt, workers walk down the aisles, a forklift moves pallets, an automated guided vehicle (AGV) moves about the space. It’s a dynamic, ever-changing environment. A person moving through this space relies on markers to navigate safely and efficiently. But the space changes: a forklift could block an aisle, or a product could be moved to a different location. These changes result in a loss of efficiency.
A good example is how you use Google Maps. The system is designed to route you to your destination in an optimal way. It uses spatial computing: it combines data from your location, data from other cars, and more, and uses that information to deliver the optimal path via augmented reality. The system has a more comprehensive view of the space than the driver; from the data it can compute that there is a traffic jam ahead and re-route accordingly.
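The re-routing behind this example can be pictured as shortest-path search over a road graph whose edge costs are updated with live traffic data. Here is a minimal sketch in Python, assuming a hypothetical dict-of-dicts road graph; the place names and travel times are illustrative, not taken from any real mapping API:

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm over a dict-of-dicts road graph,
    where graph[u][v] is the travel time from u to v."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, cost in graph.get(u, {}).items():
            nd = d + cost
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    if goal not in dist:
        return None  # no route exists
    # Walk predecessors back from the goal to reconstruct the route
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

# Hypothetical road graph: edge weights are travel times in minutes.
roads = {
    "home":    {"highway": 5, "side_st": 8},
    "highway": {"office": 10},
    "side_st": {"office": 9},
}
print(shortest_path(roads, "home", "office"))  # ['home', 'highway', 'office']

# Live traffic data raises the highway's cost; the same search re-routes.
roads["highway"]["office"] = 30
print(shortest_path(roads, "home", "office"))  # ['home', 'side_st', 'office']
```

The point is that the "comprehensive view" lives in the edge weights: when sensor data changes a cost, re-running the same search yields a new optimal route, and AR is simply the interface that presents it.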
As a user, you feel safe and comfortable following the directions because the user interface acts as a buffer between you and the world. The system helps you optimize your routines: augmented reality is the user interface through which you see that optimization, while spatial computing does the computation.
This is augmented reality at architectural scale.
Let’s consider another example: a fire in a large office building. Hundreds of employees need to evacuate as quickly as possible. Typically, employees would follow signage to the nearest exit. However, this leaves a lot to chance: the majority of people could head for the same staircase, creating a backup and a delay. Others could go to the nearest exit without knowing that the fire will block their path a few flights down.
With spatial computing, data from different sources (IoT devices, cameras, sensors) is evaluated in real time, and the most efficient exit for each specific person can be identified based on their location. The optimal route could be delivered via a push notification on the individual’s phone; clicking the notification brings them to an AR experience that guides them along the best exit route. The net result is a safer situation for all.
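One way to picture that computation: treat the floor plan as a graph and search from each person’s location to the nearest exit, skipping any node that sensor data flags as blocked. A minimal sketch, assuming a hypothetical floor-plan graph; the room and stair names are illustrative:

```python
from collections import deque

def safest_exit_route(floor_graph, person_at, exits, blocked):
    """Breadth-first search from the person's location to the
    nearest exit, skipping nodes flagged as blocked by sensor data."""
    queue = deque([(person_at, [person_at])])
    seen = {person_at}
    while queue:
        node, path = queue.popleft()
        if node in exits:
            return path
        for nxt in floor_graph.get(node, []):
            if nxt not in seen and nxt not in blocked:
                seen.add(nxt)
                queue.append((nxt, path + [nxt]))
    return None  # no safe route found

# Hypothetical floor plan: nodes are locations, edges are walkable links.
floor = {
    "desk_42": ["hall_a"],
    "hall_a":  ["stair_1", "hall_b", "desk_42"],
    "hall_b":  ["stair_2", "hall_a"],
    "stair_1": ["hall_a"],
    "stair_2": ["hall_b"],
}
exits = {"stair_1", "stair_2"}

# Normally the nearest exit wins:
print(safest_exit_route(floor, "desk_42", exits, blocked=set()))
# ['desk_42', 'hall_a', 'stair_1']

# A smoke sensor reports fire at stair_1; the system re-routes:
print(safest_exit_route(floor, "desk_42", exits, blocked={"stair_1"}))
# ['desk_42', 'hall_a', 'hall_b', 'stair_2']
```

Running this per person, from each person’s own location, is what lets the system spread people across exits instead of funneling everyone toward the same staircase.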
Spatial computing is a new perspective on the world around us, driven by advances in computer vision, machine learning, and IoT. It enables seamless interactions among humans, machines, and spaces to optimize processes and foster real-time collaboration in the future of work.
Within the factory, the applications for spatial computing are numerous; spatial computing and augmented reality at architectural scale can support better ways to work across many of them.
At PTC, we’re just getting started with architectural scale AR and spatial computing – stay tuned for more ideas and innovations coming out of the PTC Reality Lab. Keep up with the Reality Lab on Twitter: @PTC_RealityLab.