The two technology layers in a manufacturing facility

IT in a manufacturing context looks familiar: ERP systems, MES platforms, office networks, file servers, email, business intelligence tools. These are the systems that handle planning, procurement, finance, HR and reporting. They run on standard infrastructure – servers, switches, firewalls – and are managed by people with standard IT backgrounds.

Operational technology (OT) is the other layer: the hardware and software that controls physical processes. In a manufacturing facility that means PLCs (Programmable Logic Controllers) managing machinery sequences, SCADA (Supervisory Control and Data Acquisition) systems monitoring plant-wide operations, HMIs (Human Machine Interfaces) on the production floor, CNC machines running cutting and forming programmes, industrial robots executing repetitive tasks, conveyor systems moving materials between stations, quality control sensors checking output at every stage, and building management systems controlling environment and access. OT has been around as long as automated manufacturing. The difference today is that it's no longer isolated.

The convergence pressure is commercial. Real-time production data feeding into ERP means more accurate stock, better production scheduling and faster response to demand signals. Remote monitoring by OT vendors and internal engineers reduces downtime. Predictive maintenance – using sensor data to identify a bearing about to fail before it stops a line – requires sensor data to reach business systems. All of that requires OT to connect to IT.

The challenge is that the two layers have different priorities. IT tolerates downtime for patching and maintenance; OT cannot – stopping a production line costs money by the minute. IT devices get replaced every three to five years; OT equipment runs for ten to twenty years and may run operating systems that can't be updated without re-certification. IT security assumes a patch cycle; OT security can't. Designing infrastructure that serves both correctly requires understanding these differences from the outset.

Network architecture: designing for both IT and OT

A flat network – everything on a single VLAN with open communication between devices – is inadequate for manufacturing. It creates security exposure (any compromised device can reach any other), it gives OT systems no protection from IT traffic and it makes it impossible to enforce the different uptime and maintenance requirements that IT and OT demand.

The reference framework for OT/IT network design is the Purdue Model, which underpins the IEC 62443 industrial cybersecurity standard. In simplified form, it defines separate zones: enterprise IT at the top, manufacturing operations below it, OT and control systems below that and physical devices at the base. Each zone communicates with adjacent zones through defined interfaces, not open connections.

In practice, for an SME or mid-market manufacturer, this means OT devices sit on dedicated VLANs with no direct route to the internet. Strict firewall rules govern what can communicate between the OT zone and the IT zone. Where IT systems need production data from OT, that data flows through a controlled intermediary – not a direct connection between an ERP server and a PLC. Remote vendor access to OT systems terminates at a jump server in a DMZ, never directly into the OT network.
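The zone discipline described above can be sketched in a few lines. This is an illustrative model only – zone names and the adjacency rule are simplified from the Purdue hierarchy, not a full IEC 62443 implementation – but it captures the key property: traffic flows only between adjacent zones, so an enterprise system can never reach a control device directly.

```python
# Simplified Purdue-style zone hierarchy (illustrative names, top to bottom).
ZONES = ["enterprise", "dmz", "operations", "control", "field"]

def flow_allowed(src: str, dst: str) -> bool:
    """Allow traffic only between adjacent zones in the hierarchy."""
    return abs(ZONES.index(src) - ZONES.index(dst)) == 1

# An ERP server (enterprise) may talk to the DMZ, never to a PLC directly.
print(flow_allowed("enterprise", "dmz"))      # True
print(flow_allowed("enterprise", "control"))  # False
```

In a real deployment this policy lives in firewall rules between VLANs, but the design question the sketch poses – which zones are adjacent, and what crosses each boundary – is exactly what needs answering before cabling starts.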

This architecture needs to be specified at the cabling stage. Retrofitting proper OT/IT segmentation into a facility that was wired as a flat network is significantly more complex and expensive than building it correctly from the start. The cable routes, switch locations and comms room layout all need to reflect the zone architecture before a single rack is installed.

Connectivity requirements on the factory floor

Not everything on the factory floor has the same connectivity requirements, and the cabling and wireless strategy needs to reflect that.

PLCs, SCADA servers and CNC controllers are typically wired. Industrial Ethernet – using protocols such as PROFINET, EtherNet/IP or Modbus TCP – provides the reliability and determinism these systems need. Latency that would be imperceptible on an office network can cause a robot arm to miss its timing window. Physical cable eliminates the variables of RF interference and wireless contention that matter when machine control is involved.

AGVs, tablets and handheld scanners are wireless by necessity. Industrial environments make wireless harder than office environments: steel structures cause multipath reflections, motors and inverters generate RF interference and moving equipment needs to roam between access points without dropping a connection. Consumer-grade access points aren't built for this. Industrial-grade APs handle higher temperatures, tolerate vibration, support faster roaming protocols and are rated for the environments they're deployed in.

Wi-Fi 7 (802.11be) improves the picture for high-demand wireless deployments. Multi-link operation allows a device to use multiple frequency bands simultaneously, which improves throughput and resilience in interference-heavy environments. For AGV fleets and dense handheld scanner deployments, this matters. We cover the specifics in our article on Wi-Fi installation for warehouses and manufacturing.

Cabling specification matters too. Cat 6A supports Wi-Fi 7 access points and PoE++ power delivery. In industrial environments, cable runs through conduit rather than open cable tray – both for protection from physical damage and to allow future recabling without the disruption of pulling new routes. Specifying conduit capacity for 150% of the initial cable count is standard practice on new builds.
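The 150% conduit rule is simple arithmetic, but it is worth making explicit at specification time. A minimal helper (the 1.5 multiplier is the rule of thumb stated above; adjust to your own standard):

```python
import math

def conduit_cable_capacity(initial_cables: int, headroom: float = 1.5) -> int:
    """Number of cables a conduit run should be sized for,
    given the 150%-of-initial-count rule of thumb."""
    return math.ceil(initial_cables * headroom)

print(conduit_cable_capacity(24))  # 36
print(conduit_cable_capacity(7))   # 11
```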

ERP and MES integration

The IT/OT boundary in data terms is the point where production information generated by OT systems needs to reach the IT systems that use it. Getting this right is where the infrastructure and the software architecture intersect.

OPC-UA (OPC Unified Architecture) is the industrial data standard for this exchange. It provides a vendor-neutral, secure method for OT systems to expose data that IT systems can consume – without direct network connections between the two zones. Most modern PLCs and SCADA platforms support OPC-UA; for older equipment, gateways can bridge legacy protocols. Designing the OT network to accommodate OPC-UA data flows is much easier than retrofitting it.

Direct database connections between OT and IT – where a SCADA system writes directly to an ERP database – are common in facilities that have grown organically, but they create security and reliability problems. A corrupted write from OT can damage IT data. A change in the ERP database schema breaks the integration. These connections tend to be undocumented and fragile.

The MES – Manufacturing Execution System – is the layer that properly bridges OT and ERP. It receives production data from the OT layer (through OPC-UA or SCADA), tracks work-in-progress, manages quality control data and passes completed production records to the ERP. Whether you need one depends on production complexity; when you have one, it has specific infrastructure requirements: server capacity, database connectivity to both OT and IT networks (through the appropriate DMZ), and enough redundancy that an MES failure doesn't stop a production line.

Data historian systems sit alongside MES in the OT/IT integration picture. A historian collects time-series data from PLCs and sensors and makes it available for analytics and reporting – feeding dashboards, maintenance systems and production performance analysis. The historian typically sits in the OT DMZ, collecting from OT and serving to IT, which is exactly what the zone architecture is designed to support.
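At its core a historian is a time-series store with aggregate queries over it. The toy version below is a sketch under simplifying assumptions (in-memory storage, samples arriving in time order, invented tag values) – production historians add compression, retention policies and high-rate ingest – but it shows the collect-from-OT, serve-to-IT shape:

```python
from bisect import bisect_left, bisect_right

class Historian:
    """Toy time-series store: collect timestamped samples from OT,
    serve windowed aggregates to IT-side reporting."""

    def __init__(self) -> None:
        self._times: list[float] = []
        self._values: list[float] = []

    def record(self, ts: float, value: float) -> None:
        # Samples assumed to arrive in time order (typical for PLC polling).
        self._times.append(ts)
        self._values.append(value)

    def mean(self, start: float, end: float) -> float:
        """Mean of all samples with start <= timestamp <= end."""
        lo = bisect_left(self._times, start)
        hi = bisect_right(self._times, end)
        window = self._values[lo:hi]
        return sum(window) / len(window)

h = Historian()
for ts, temp in [(0, 10.0), (60, 12.0), (120, 14.0)]:
    h.record(ts, temp)
print(h.mean(0, 60))  # 11.0
```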

Security: the infrastructure decisions that determine your exposure

Security architecture for OT/IT environments isn't a separate workstream bolted on after the network is designed. The security model is built into the network architecture itself. The decisions made at design stage determine the exposure.

Which OT systems need internet access? Very few should. A PLC doesn't need to reach the internet. A SCADA server doesn't need to reach the internet. Where OT vendor updates are delivered remotely, that access should terminate at a defined jump server with logging and session recording – never a direct connection into the OT network. Anything that doesn't need internet access shouldn't have a route to it, and the firewall rules should reflect that explicitly, not by default.

Where vendor remote access is required – and it usually is, for OT system support and troubleshooting – the connection should terminate in a DMZ on a jump server that provides session-level access to specific OT systems. The vendor doesn't get a VPN into the whole OT network; they get a recorded, logged session to the system they're there to fix.

Monitoring OT networks requires a different approach from IT network monitoring. Active scanning – which is standard practice in IT – can disrupt OT systems. PLCs don't handle unexpected network traffic the way servers do. Passive monitoring, which observes traffic without generating any, is the appropriate method for production-critical OT environments. OT-specific security monitoring tools understand industrial protocols and can detect anomalous command sequences that general-purpose monitoring misses.
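One class of check such tools perform can be sketched simply: flag write commands arriving from any source that is not a known engineering workstation. The function codes below are the standard Modbus write functions, but the IP addresses and observed traffic are invented for the example – this is a shape sketch, not a monitoring product.

```python
# Passive-monitoring sketch: flag Modbus write commands from sources
# that are not on the engineering-workstation allowlist.
WRITE_CODES = {0x05, 0x06, 0x0F, 0x10}   # Modbus write function codes
ALLOWED_WRITERS = {"10.20.1.5"}          # assumed engineering workstation

def is_anomalous(src_ip: str, function_code: int) -> bool:
    """True if a write command comes from an unexpected source."""
    return function_code in WRITE_CODES and src_ip not in ALLOWED_WRITERS

observed = [("10.20.1.5", 0x06),   # allowed workstation writing: fine
            ("10.30.9.9", 0x10),   # unknown host writing: alert
            ("10.20.1.7", 0x03)]   # read request: fine
alerts = [pkt for pkt in observed if is_anomalous(*pkt)]
print(alerts)  # [('10.30.9.9', 16)]
```

Crucially, this logic only observes traffic; it sends nothing onto the OT network, which is the property that makes it safe around PLCs.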

The most expensive security problems in manufacturing come from infrastructure decisions made without security input. An OT system connected to the corporate network without a firewall. A vendor remote access solution that routes through the IT network with no OT-specific controls. A flat Wi-Fi network where OT and corporate devices share an SSID. These decisions get made at build stage and are expensive to undo.

Specifying IT and OT infrastructure for a new or refurbished facility

The specification process for a new manufacturing facility has a sequencing problem that catches out many projects. IT infrastructure is often specified before OT vendor engagement is complete – which means the IT design doesn't accommodate the connectivity requirements of the OT systems that will run on it.

OT vendors have specific requirements. Some PLCs require multicast routing. Some SCADA platforms need specific firewall rules. Industrial robots may need dedicated switch ports with guaranteed bandwidth. These requirements need to be known before the network architecture is finalised, which means OT vendor engagement needs to happen during the design phase, not after the network is installed.

Power provisioning for OT panels is a common specification gap. OT equipment – industrial switches, panel PCs, server hardware in production environments – has different power requirements from office IT. Dedicated circuits, appropriate UPS capacity and sufficient breaker capacity in the distribution board need to be specified correctly. Retrofitting power is disruptive and expensive.

Comms room location and size matter more in manufacturing than in office environments. OT equipment may require rack space close to the production floor – which means the comms room needs to be physically close, with the environmental controls (cooling, fire suppression) appropriate for production environments. A comms room designed for an office IT infrastructure that subsequently needs to accommodate OT switching and server equipment will almost always be in the wrong place and the wrong size.

Redundancy requirements for OT systems need to be specified explicitly. For business-critical production systems: dual power feeds to OT network switches, network failover topology (ring or dual-homed, depending on criticality), UPS on all active OT infrastructure. These requirements should come from an assessment of what a production line stoppage costs per hour – which makes the cost of redundancy straightforward to justify.
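That justification is one division. The figures below are illustrative, not benchmarks – substitute your own stoppage cost and redundancy quote:

```python
def redundancy_payback_hours(redundancy_cost: float,
                             stoppage_cost_per_hour: float) -> float:
    """Hours of avoided line stoppage at which redundancy spend pays back."""
    return redundancy_cost / stoppage_cost_per_hour

# Illustrative: £40,000 of redundant switching and UPS, against a line
# that costs £10,000 per hour when stopped, pays back after 4 avoided hours.
print(redundancy_payback_hours(40_000, 10_000))  # 4.0
```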

Acceptance testing before go-live is non-negotiable on OT infrastructure. Network performance testing, failover testing, security posture verification and OT system integration testing should all be completed and signed off before production goes live. A connectivity problem discovered in testing is a configuration change. The same problem discovered after go-live is an unplanned outage.

Route B designs and delivers IT and OT infrastructure for manufacturing facilities – from initial specification through to go-live commissioning.

Get in Touch