Defining the Infrastructure for Big Data Analytics

Written By: Steve Dertien
  • 10/25/2015

This is the second post in a series of three addressing strategic considerations for how to approach your organization’s Big Data infrastructure to create business value.

The first post described how smart, connected products have unleashed a new era of innovation and opportunity around the promise of Big Data. In order to extract its real value, manufacturers must embrace an exploratory and experimental mindset regarding data and analytics.

Building and supporting a new global technology infrastructure and capabilities for smart, connected products data requires substantial investment and a range of new skills—such as Internet of Things applications development, data analytics, and Big Data management—rarely found in manufacturing companies.

So why invest? There are four categories of analytics that create business value opportunities for manufacturers across the enterprise. The complexity of the analytics capabilities required and the magnitude of the business value opportunity both increase across these four categories:

Descriptive: Analysis of historical data that provides a straightforward view of past activities and performance, used to understand previous behavior and outcomes;

Diagnostic: Utilization of historical data to identify a product failure pattern and determine the failure’s root cause. Once the root cause is understood and diagnosed, the resolution can also be identified through mapping to knowledge management tools;

Predictive: Use of modeling, data mining, and machine learning to analyze both real-time and historical data to predict and anticipate future events based on patterns found in the data;

Prescriptive: Once a future event (outcome) is predicted, a suggested next step and/or decision is identified, evaluated, and can be automatically enabled.
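The four categories can be made concrete with a toy example. The following Python sketch applies each kind of analysis to a series of engine-temperature readings; the dataset, thresholds, and trend logic are all illustrative assumptions, not drawn from the article:

```python
from statistics import mean

# Toy hourly engine-temperature readings (°C); values above 110 count as
# failures. Both the data and the threshold are illustrative assumptions.
readings = [92, 95, 101, 108, 112, 99, 97, 111, 113, 115]
FAILURE_TEMP = 110

def descriptive(data):
    """Descriptive: summarize what happened in the past."""
    return {"mean_temp": mean(data), "failures": sum(t > FAILURE_TEMP for t in data)}

def diagnostic(data):
    """Diagnostic: find a pattern behind past failures (here, a reading that
    was already near the limit immediately before each over-temp event)."""
    return [i for i, t in enumerate(data)
            if t > FAILURE_TEMP and i > 0 and data[i - 1] > FAILURE_TEMP - 5]

def predictive(data, window=3):
    """Predictive: extrapolate the recent linear trend to the next reading."""
    recent = data[-window:]
    slope = (recent[-1] - recent[0]) / (window - 1)
    return recent[-1] + slope

def prescriptive(data):
    """Prescriptive: turn the prediction into a suggested next step."""
    forecast = predictive(data)
    return "schedule service" if forecast > FAILURE_TEMP else "continue monitoring"
```

Each function builds on the one before it, mirroring how the value and complexity of the four categories compound.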

At the core of this technology infrastructure is the rules and analytics engine, which supplies the rules, business logic, and Big Data analytics capabilities that create value from data. Each of the three layers is a building block for the one above it, which creates some overlap in capabilities across layers. Together, the layers encapsulate the capabilities manufacturers require to enable most Big Data use cases. (See below.)

Figure 1: Big Data Chart

The rules engine, not represented in Figure 1, serves as the business filter to express the factual conditions and regulations by which decisions should be made within the analytics engine. For example, an engine that emits an over-temperature alarm five times within an hour should execute a service diagnostic analysis, whereas five engines with the same condition in a one month period should invoke a quality diagnostic analysis.
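The windowed counting behind rules like these can be sketched as follows. The thresholds mirror the example above, while the function names, the per-engine data shape, and the fleet-wide grouping are illustrative assumptions:

```python
from datetime import datetime, timedelta

# Rule thresholds taken from the example in the text.
SERVICE_RULE = {"count": 5, "window": timedelta(hours=1)}   # one engine
QUALITY_RULE = {"count": 5, "window": timedelta(days=30)}   # across engines

def alarms_in_window(timestamps, window):
    """Largest number of alarms that fall inside any sliding window."""
    ts = sorted(timestamps)
    best = 0
    for i, start in enumerate(ts):
        best = max(best, sum(1 for t in ts[i:] if t - start <= window))
    return best

def evaluate(per_engine_alarms):
    """per_engine_alarms: {engine_id: [alarm datetimes]} -> actions to run."""
    actions = set()
    for engine, times in per_engine_alarms.items():
        if alarms_in_window(times, SERVICE_RULE["window"]) >= SERVICE_RULE["count"]:
            actions.add(("service_diagnostic", engine))
    # Simplification: one representative alarm per engine, compared fleet-wide.
    fleet = [min(times) for times in per_engine_alarms.values()]
    if (len(per_engine_alarms) >= QUALITY_RULE["count"]
            and alarms_in_window(fleet, QUALITY_RULE["window"]) >= QUALITY_RULE["count"]):
        actions.add(("quality_diagnostic", "fleet"))
    return actions
```

Five alarms from one engine inside an hour yields a service diagnostic; one alarm each from five engines inside a month yields a quality diagnostic.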

In contrast, an engine that produces signals indicating low coolant levels and a rapid rise in ambient air, exhaust gas, and coolant temperatures alongside increasing cylinder pressure could indicate that a failure is imminent—the kind of condition predictive and prescriptive analytics are designed to catch. The process by which analytics data are sequenced and organized can be used to guide a desired outcome or indicate when human intervention is required. Minimally, this alarm or signal could trigger a service event for a technician; more substantially, it could be used to alter the device’s behavior to avoid damage or defer service to a more opportune time.
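The multi-signal condition described above can be sketched as a minimal check. The signal names, thresholds, and actions below are hypothetical assumptions, not a real engine-control API:

```python
def assess(signals):
    """signals: latest sensor snapshot (a dict) -> (severity, action).
    All field names and thresholds here are illustrative assumptions."""
    imminent = (signals["coolant_level"] < 0.2
                and signals["coolant_temp_rising"]
                and signals["exhaust_temp_rising"]
                and signals["cylinder_pressure_rising"])
    if imminent:
        # Prescriptive response: change device behavior to avoid damage.
        return ("critical", "derate_engine_and_dispatch_technician")
    if signals["coolant_level"] < 0.2:
        # Minimal response: raise a service event for a technician.
        return ("warning", "create_service_ticket")
    return ("normal", "none")
```

A single weak signal produces the minimal response, while the full combination of signals escalates to an automatic behavior change.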

1. Backend Infrastructure required for data acquisition: The bottom layer of the technology stack is responsible for the collection and storage of data from the multiple sources described above, including data governance and storage, data collection and integration, and SQL/NoSQL data management.

Because of the volume and variety of this data, and the discovery-natured approach to creating value from Big Data, some firms are establishing “data lakes” as the source for their Big Data infrastructure.

A data lake is a data-acquisition approach that holds raw data in its native format until it is needed, deferring decisions about how the data is used, governed, defined, and secured by the organization. Information management leaders should understand the gaps in this concept—such as semantics, governance, and security—and take the necessary precautions.
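A minimal sketch of the “land raw data as-is” write path, assuming a filesystem-backed lake partitioned by source and arrival date (the paths, naming scheme, and function are illustrative, not a specific product’s API):

```python
import time
from pathlib import Path

def land_raw(root, source, payload_bytes, suffix):
    """Persist one payload verbatim in its native format, partitioned only by
    source and arrival date. Schema, governance, and access control are
    deliberately deferred -- the very gaps the text warns about."""
    day = time.strftime("%Y/%m/%d", time.gmtime())
    target = Path(root) / source / day
    target.mkdir(parents=True, exist_ok=True)
    path = target / f"{time.time_ns()}.{suffix}"
    path.write_bytes(payload_bytes)   # stored as-is, no parsing or cleansing
    return path
```

For example, `land_raw("lake", "engine-telemetry", b'{"temp": 101}', "json")` files the payload under `lake/engine-telemetry/<year>/<month>/<day>/` untouched, leaving interpretation to downstream consumers.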

2. Analytics capabilities required to marshal and analyze the data: The middle layer is responsible for organizing and preparing data for analysis, and for analyzing the prepared data sets via processes and algorithms. There are many varieties of analytics capabilities, and depending on the specific use case, any one or combination of them may be required to transform data into actionable insights.
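As one toy illustration of marshaling and then analyzing, the sketch below cleans a messy raw series and flags outliers. The cleansing rules, the sample data, and the deviation threshold are all illustrative assumptions:

```python
from statistics import mean, stdev

def marshal(raw):
    """Prepare: drop missing or unusable values and coerce to floats."""
    return [float(x) for x in raw if x not in (None, "", "NaN")]

def analyze(series, z=1.5):
    """Analyze: flag readings more than z standard deviations from the mean.
    The 1.5-sigma cutoff is an arbitrary choice for this tiny sample."""
    m, s = mean(series), stdev(series)
    return [x for x in series if abs(x - m) > z * s]

# A messy raw feed: mixed types, gaps, and one anomalous spike.
raw = [98.0, "101.5", None, 99.2, "NaN", 100.1, 140.0]
anomalies = analyze(marshal(raw))
```

The two stages stay separate on purpose: the same marshaled data set can feed whichever analytic capability a given use case requires.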

3. Line-of-Business applications deliver the right information to the right user: Users across business functions need specific information to unlock business value, based on their role and purpose, and those needs will change over time. Within any business system are numerous pre-existing analytics capabilities, but until now many of these have operated only on human-captured data. The data and intelligence from smart, connected products can be combined to improve existing business systems and processes and enable entirely new processes across all functions in the enterprise.

Companies looking to rapidly and economically build and maintain the applications that enable non-technical users across business functions to create value from data will leverage an IoT application platform to meet the escalating demand. Conventional application development tools and approaches are not designed to work with the unique and evolving requirements of IoT applications and Big Data. An IoT application platform will streamline the development process and integrate the systems and people, as well as the data, in order to make application development and maintenance as efficient as possible.

This is an excerpt from an article first published in Frost & Sullivan’s Manufacturing Leadership Council’s thought-leading Manufacturing Leadership Journal.

About the Author

Steve Dertien

Steve Dertien is the chief technology officer and managing director of the Office of the Chief Technology Officer at PTC. In this role, Steve champions corporate technology strategy across PTC’s products including augmented and virtual reality, the industrial internet of things, software architecture, computer-aided design, and product lifecycle management. Steve also leads the innovation research lab and advanced technology development groups, where he evangelizes state-of-the-art advancements with PTC technology and the business advantages it can provide to PTC’s customers and global partner network.