
The Prerequisites of AI: Why Data Foundations Matter

April 27, 2026
Emily Himes is a Content Marketing Specialist on PTC’s Commercial Marketing team based in Boston, MA. Her writing supports a variety of PTC’s product and service offerings.

Most AI discussions start in the same place: evaluating tools, models, and analytics capabilities. That’s understandable—those are the visible parts of AI investment. But in manufacturing, the outcome of an AI initiative is typically decided long before those tools ever come into play.

AI depends on something far more foundational: the ability to move data reliably, consistently, and securely across the enterprise. Before a single model is trained or insight is delivered, the architecture underneath determines whether data can actually be accessed, contextualized, and trusted at scale. When that architecture isn’t designed for enterprise-wide intelligence, even the most ambitious AI strategies struggle to deliver lasting value.

Today, the most important AI question is often the one organizations skip: Can our existing architecture reliably deliver the data AI assumes will be available?

In many industrial environments, the answer is no. Not because the data doesn’t exist, but because the architecture was never designed to operationalize it beyond isolated systems.

The AI assumption gap

AI is built on a set of assumptions about data, including that it is:

  • Accessible
  • Consistent
  • Contextualized
  • Deliverable at scale

Those assumptions rarely hold across legacy industrial environments. OT data may be generated continuously, but access varies by site. Context often lives inside individual tools such as SCADA, MES, or historians rather than in the architecture itself. As a result, data that looks plentiful on paper proves difficult to operationalize in practice.

The problem isn’t a lack of data. Many initiatives lose momentum because the underlying architectures were never designed to support AI beyond isolated use cases, which means manufacturers aren’t set up for success from the start.

When architectures don’t evolve, risk accumulates

Legacy architectures rarely fail all at once. More often, they persist because they’re “good enough” for today’s needs and for getting product out the door. But as AI initiatives accumulate, so does hidden complexity.

Each new project tends to introduce more integrations, exceptions, or dependencies tailored to a specific system or site. Over time, point-to-point connections multiply, governance and strategy continue to be overlooked, and securing and managing the environment grows harder with every iteration.

Connectivity: The prerequisite to AI success

Connectivity is often treated as an afterthought: something to address in the last mile of a project, after upstream systems have already been determined and pilots are well underway. In practice, the opposite is true: connectivity determines whether AI can scale at all.

Modern industrial connectivity creates the foundation AI depends on by:

  • Abstracting devices from applications
  • Normalizing and aggregating OT data into a single access point
  • Delivering information securely across environments
  • Supporting modern, event-driven architectures

Rather than acting as a simple transport layer, connectivity becomes the structural backbone that turns fragmented signals into usable enterprise data. Without that backbone, AI initiatives remain localized—capable of generating insights in pockets, but unable to drive impact reliably across lines, systems, or sites.
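As a rough illustration of what that normalization step can look like, here is a minimal Python sketch. Everything in it is hypothetical: the NormalizedReading schema, the from_scada and from_historian translators, and the payload field names are invented for illustration, not drawn from any specific product. The point is simply that readings from very different source systems can be mapped into one consistent shape before anything downstream, AI included, consumes them.

    from dataclasses import dataclass
    from datetime import datetime, timezone
    from typing import Any, Callable, Dict

    # One common shape for every reading, regardless of which source system produced it.
    @dataclass
    class NormalizedReading:
        site: str
        asset: str
        tag: str
        value: float
        unit: str
        timestamp: datetime

    # Hypothetical translators: each one knows a single vendor-specific payload layout.
    def from_scada(payload: Dict[str, Any]) -> NormalizedReading:
        return NormalizedReading(
            site=payload["plant"],
            asset=payload["equipment_id"],
            tag=payload["point"],
            value=float(payload["val"]),
            unit=payload.get("eu", "unknown"),
            timestamp=datetime.fromtimestamp(payload["ts"], tz=timezone.utc),
        )

    def from_historian(payload: Dict[str, Any]) -> NormalizedReading:
        return NormalizedReading(
            site=payload["Site"],
            asset=payload["AssetName"],
            tag=payload["TagName"],
            value=float(payload["Value"]),
            unit=payload.get("Units", "unknown"),
            timestamp=datetime.fromisoformat(payload["Timestamp"]),
        )

    # Single access point: route each raw event to the translator that understands it.
    TRANSLATORS: Dict[str, Callable[[Dict[str, Any]], NormalizedReading]] = {
        "scada": from_scada,
        "historian": from_historian,
    }

    def normalize(source: str, payload: Dict[str, Any]) -> NormalizedReading:
        return TRANSLATORS[source](payload)

    # Example: a raw SCADA-style payload becomes one consistent record.
    print(normalize("scada", {"plant": "Monterrey", "equipment_id": "PUMP-07",
                              "point": "DischargePressure", "val": "4.2",
                              "eu": "bar", "ts": 1714200000}))

In a real event-driven architecture, a function like normalize would sit behind the broker or gateway so that every subscriber, from dashboards to AI pipelines, sees the same contextualized records rather than vendor-specific payloads.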

Why standardization changes the trajectory

The shift to standardized connectivity fundamentally changes how organizations approach AI and data initiatives. Instead of rebuilding data pipelines for each new initiative, manufacturers can reuse trusted data flows and repeat proven architectures across plants.

This repeatability is what enables confidence. When data access is consistent and governed by design, teams spend less time managing integrations and making sense of data and more time advancing outcomes. AI moves from experimentation toward operational success—not because models improved, but because the foundation finally supports scale.

The real AI differentiator

As AI expectations continue to rise across manufacturing, the true differentiator won’t be who adopted the latest tools first. It will be which organizations invested early in the data foundations that make scalable intelligence possible.

When connectivity is standardized, secure, and scalable, data stops living inside silos and starts behaving like enterprise infrastructure. That shift allows AI initiatives to build on one another rather than resetting with every new project. At Hannover Messe, Max Ramírez, Director at Infoportal, reiterated that “Data preparation is key, and the first step before AI can be applied.” Once that’s done, he said, the “opportunities are endless.”

For manufacturers moving beyond AI experimentation, the challenge now is less about why architecture matters and more about how to evaluate their own foundations honestly. Where is data still fragmented? Where does context live? And where do today’s connectivity decisions limit what AI can realistically deliver tomorrow?

Topics: Artificial Intelligence, Connected Devices, Digital Transformation, IT/OT Convergence

Assess your connectivity foundation

Explore our latest infographic to assess whether your connectivity foundation is built to support enterprise-scale AI, and what must evolve before AI expectations rise any further.
