The question always comes up at industrial management committee meetings: “What is the real performance of our plants?” In a multi-site group, this seemingly simple question often triggers endless debate. Plant A reports an OEE of 74%, Plant B 68%, and Plant C 58%. But are these figures comparable? The answer is almost always no.
For industrial groups, harmonizing multi-site OEE (Overall Equipment Effectiveness) is becoming a major strategic challenge. Without rigorous standardization of these key performance indicators, it’s impossible to effectively manage a fleet of plants or prioritize investments.
This article explores how to truly harmonize global equipment performance across multiple production environments. The aim: to transform incomparable data into a genuine strategic management tool.
Why Multi-Site Comparison Is So Complex
The first illusion to dispel: an industrial group does not have one OEE, but many OEEs. Each site has developed its own interpretation of the indicator, creating an industrial Tower of Babel where everyone speaks their own language.
The differences begin with the definition of the reference time. Some plants calculate their OEE on the basis of theoretical opening time, others on the basis of actual attendance time, while others exclude changeover times. As a result, three identical plants show radically different OEEs simply because they don’t measure the same thing.
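To see how much the choice of denominator alone can move the figure, consider a minimal sketch; every number below is invented purely for illustration:

```python
# Hypothetical shift data: the same production, three "OEE" figures,
# depending solely on which reference time each site uses as its denominator.

good_output_time_min = 310                # good parts x ideal cycle time, in minutes
reference_times = {
    "theoretical opening time": 480,      # full 8-hour calendar shift
    "actual attendance time": 450,        # shift minus breaks actually worked
    "attendance excl. changeovers": 420,  # attendance minus 30 min of changeovers
}

for label, reference_min in reference_times.items():
    print(f"{label}: {good_output_time_min / reference_min:.0%}")

# -> roughly 65%, 69% and 74% for strictly identical production.
```

Nearly ten points of apparent performance appear or disappear without a single additional part being produced.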
Availability is subject to equally varied interpretations. A ten-minute breakdown will be considered a micro-stop at one site and a planned shutdown at another. Machine set-up times are sometimes included in shutdowns, sometimes excluded from the calculation.
Heterogeneity of Tools and Solutions
Data collection tools vary dramatically from one site to another. The historic plant uses Excel spreadsheets, the recent site has a modern manufacturing execution system, and the acquired plant works on incompatible proprietary software. This technical heterogeneity amplifies methodological discrepancies and complicates any data-driven decision-making.
Mistakes That Prevent Performance Analysis
Without standardization, multi-site OEE becomes a political exercise rather than a management tool. Sites develop strategies to optimize their figures rather than their production processes.
Category Manipulation and Cause Analysis
The most common abuse is the opportunistic classification of production stoppages. A site in difficulty will systematically reclassify its breakdowns as planned maintenance to artificially improve its availability. Quality defects are transformed into “product tests”. These reclassifications completely distort root-cause analysis and make it impossible to identify the real problems behind the loss of availability.
Another common practice is the manipulation of reference times. Some plants progressively exclude from their calculations every period when performance is degraded: production start-ups, end-of-shift slowdowns. The result is flattering OEE figures that no longer reflect actual industrial performance.
War of Numbers and Lack of Visibility
In management committees, each plant manager defends their results by invoking local specificities. These arguments become shields against any comparative analysis.
Discussions go round in circles, and management lacks the clear visibility needed to identify best practices or detect sites in genuine difficulty. This lack of standardization breeds a general mistrust that poisons relations between sites.
The Six Pillars of OEE Multi-Site Standardization
Successful harmonization requires a methodology structured around six fundamental pillars, which must be rigorously applied across all sites.
Single Definition and Calculation Framework
The group must make a final decision: which reference time? How are changeovers treated? This framework becomes the single reference for all sites, without exception. The success of the implementation depends on the ability to impose this standard everywhere.
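As a purely illustrative sketch of what such a framework might pin down, the group-level choices can be written out explicitly; every field name and value below is an assumption, not an established standard:

```python
# Hypothetical group-wide OEE framework, frozen once at group level and deployed to every site.
GROUP_OEE_STANDARD = {
    "reference_time": "actual_attendance",           # the agreed denominator for all sites
    "changeovers": "availability_loss",              # changeovers stay in the calculation as downtime
    "micro_stop_threshold_min": 5,                   # stops shorter than this count as performance loss
    "planned_maintenance": "excluded_if_scheduled",  # only maintenance planned in advance is excluded
}
```

The exact choices matter less than the fact that they are made once, documented, and applied everywhere.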
Standardization of Availability and Maintenance Rules
The group needs to establish a precise taxonomy of stoppages: detection threshold, breakdown classification, treatment of recurring stoppages. A ten-minute shutdown for adjustment must be classified identically in Paris, Lyon or Shanghai, enabling true comparative monitoring. Preventive maintenance rules must also be harmonized to avoid distortions in availability.
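A hedged sketch of what such a shared rule could look like, with assumed thresholds and category names (none of them prescribed by any particular standard):

```python
# Illustrative classification rule: every site applies the same logic, so a ten-minute
# adjustment stop lands in the same category in Paris, Lyon or Shanghai.

def classify_stop(duration_min: float, planned: bool, cause: str) -> str:
    if planned:
        return "planned_maintenance"   # scheduled ahead of time in the maintenance plan
    if duration_min < 5:
        return "micro_stop"            # below the group-wide detection threshold
    if cause == "adjustment":
        return "setup_adjustment"      # counted as an availability loss, never as planned downtime
    return "unplanned_breakdown"

print(classify_stop(10, planned=False, cause="adjustment"))   # -> setup_adjustment
```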
Harmonization of Performance and Cycle Time by Line
The third pillar is theoretical output. The group must choose a single method for determining the reference rate of each production line. The most robust solution is to use the nominal output validated under optimal real operating conditions, expressed as parts produced per unit of time.
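With that convention, the performance rate simply compares actual output with what the nominal rate allows over the net run time; the figures below are invented for illustration:

```python
# Hypothetical line data: performance rate measured against the nominal output
# validated for this line under optimal real operating conditions.
nominal_rate_parts_per_min = 60     # group-validated reference rate for the line
run_time_min = 400                  # attendance time minus recorded downtime
parts_produced = 21_600

performance_rate = parts_produced / (nominal_rate_parts_per_min * run_time_min)
print(f"Performance rate: {performance_rate:.0%}")   # -> 90%
```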
Stop Categories and Common Nomenclature
The group must define a single tree of causes: mechanical breakdowns, adjustments, lack of material. This shared taxonomy makes cause analysis easier and allows meaningful group-level consolidation to improve operational efficiency.
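One possible way to write such a tree down so it can be consolidated identically across sites; the sub-causes here are invented examples, not a recommended nomenclature:

```python
# Hypothetical common cause tree: top-level categories from the group standard,
# each broken down into sub-causes that every site reports against.
CAUSE_TREE = {
    "mechanical_breakdown": ["drive", "tooling", "conveyor"],
    "adjustment": ["changeover", "fine_tuning"],
    "lack_of_material": ["upstream_shortage", "logistics_delay"],
}
```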
Performance Objectives and Standardized Indicators
The fifth pillar concerns the definition of consistent targets across the group. OEE targets must be established according to a common methodology, one that takes legitimate industrial specificities into account while eliminating methodological biases. These targets become the benchmark against which progress at each site is measured.
Digital Solutions for Asset Performance Management
The sixth and final pillar is enforcement in practice: a system capable of imposing these standards everywhere. This is the role of a platform like TEEPTRAK, designed for multi-site industrial performance management.
Technical Standardization and Implementation
TEEPTRAK applies a single standard for calculating OEE at all sites. The definition of OEE, availability rules and downtime categories are configured once at group level and deployed identically. This technical set-up ensures that all sites speak the same language.
Automated Collection and Continuous Improvement
Automated collection eliminates human bias. Stoppages are automatically detected and precisely time-stamped. It becomes impossible to opportunistically reclassify a breakdown or to exclude certain periods from the calculation. The figures become objective, enabling a real improvement in competitiveness.
Comparative Dashboards and Real-Time Access
Dashboards let management instantly visualize the performance of all sites against the same criteria. Real-time data shared between headquarters and the shop floor creates total transparency and brings immediate value to management.
Case Study: Six Plants, Measurable Return on Investment
Let’s take the example of an industrial group operating six mechanical manufacturing plants in Europe. Before harmonization, reported OEEs ranged from 58% to 74%, but these figures could not be compared: each facility measured them with its own methodology.
The group deployed TEEPTRAK with a standardized definition. Automated data collection eliminated local interpretations. The results revealed that the German plant excelled on availability thanks to exemplary maintenance, while the French plant achieved the best quality results.
These insights enabled targeted skills transfers. Within three months, the worst-performing sites gained five points by applying best practices that already existed elsewhere in the group. The group’s industrial management was then able to use this objective data to prioritize investments rationally, demonstrating a rapid return on investment.
Conclusion: From Measurement to Performance
Harmonizing multi-site OEE is much more than just a technical project. It is a cultural transformation that moves an industrial group from a logic of autonomous silos to a genuine dynamic of collective performance.
The benefits go far beyond the simple comparability of figures. Successful standardization allows best practices to be identified and deployed rapidly, performance drift to be detected immediately, and investments to be directed towards the most profitable opportunities. It also transforms relations between sites: transparency replaces mistrust, and collaboration replaces sterile competition.
For industrial managers, finally having reliable, comparable data at their disposal radically changes their ability to steer. Strategic decisions are based on objective facts rather than political negotiations. Management committees become moments of constructive analysis rather than exercises in justification.
Multi-site OEE harmonization is not a luxury reserved for large groups with unlimited budgets. It is a strategic necessity for any company operating several production sites. In an increasingly competitive industrial environment, the ability to measure, compare and improve performance on a group-wide scale is becoming a decisive competitive advantage. With the right methodology and tools, this harmonization is within the reach of all industrial groups determined to move from a collection of plants to a genuine performance network.