

6:00 am
June 1, 2008
Print Friendly

Building Cultures of Reliability-In-Action

It’s time to talk. Getting where you want to be requires stepping away from traditional methodologies.

In the first article of this series (December 2007), the author covered the underlying assumptions of cultures-in-action and how human reasoning and resulting decisions impact performance and reliability. In the second installment (January 2008), he addressed how functional Collaborative Design tools contribute to creating a culture of reliability. This month, he discusses the implementation process of Collaborative Design and how it sustains a culture of reliability-in-action.

The ability to sustain a culture of reliability-in-action rests in creating informed choice in decision making, balancing control through expanded discussability. The result is the co-creation of psychological safety for all involved. Surpassing current levels of performance requires uncovering hidden performance bottlenecks. Many teams sincerely believe they are open and honest, yet remain blind to the deeper assumptions and issues inhibiting performance.

Collaborative Design is most effective when the stakes, either in substance or perception, are high. Implementation of such a high-performance system calls for going beyond traditional change and training methods. Requirements are:

  • Collecting cultural action data (not survey data) to document decision-making patterns-in-action. [Ref. 1]
  • Using functional tools to reflect on personal contributions to effective and ineffective decision making and the resulting team co-creation.
  • Determining the business impact of daily decisions.
  • Designing psychological safety checks and balances to assure the productive expansion of discussability and the uncovering of hidden assumptions. For example, the underlying fear of letting the vice president down can be as costly as fearing career implications for a failed project.
  • Continually monitoring human decision-making patterns by institutionalizing reflection time. Examining what is happening in the human decision-making context is as important as examining the equipment and process performance data—perhaps even more important.

These criteria reflect the same plan-do-check cycles we have come to know. Where Collaborative Design differs, however, is in using functional tools to validate the productive expansion of discussability, while examining underlying assumptions and their associated costs from the get-go.

Participants learn how to work from their internal dialogues (what is thought or felt but not typically verbalized, including tacit knowledge). This approach fosters more accurate hearing of inference, resulting in a shift in understanding of how decisions-in-action are created. The result is coming to understand the distinction between advocating a strategy (an espoused theory of what needs to be done) and what it takes to produce the strategy. This sets the table for profound change and increased performance.

More precise data becomes available, including untested theories, standards and emotions resting in people's heads (about leadership style, personal effectiveness, what is motivating others, etc.). These belief systems are safely revealed, and the underlying assumptions informing them become extractable and manageable. Without uncovering the underlying reasoning, it is highly likely that the culture and its fear patterns will define what change is acceptable, rather than allowing root-cause change of the culture itself. Instead of learning about performance bottlenecks six months or a year down the implementation path, teams uncover and manage issues early. This is preventive maintenance at its best, applied to the human decision-making system.

Scary and exciting
Examining decision-making-in-action is both scary and exciting for those first exposed to Collaborative Design. Many theorists, managers and teams believe they are honest and open—nothing is undiscussable, they typically relate. What a humbling experience it is when Collaborative Design reveals that what they say and what they do are different, and that this misalignment impacts performance.

Outage lessons-learned sessions or root-cause analyses (see “Why Some Root-Cause Investigations Don’t Prevent Recurrence,” by Randall Noon, Maintenance Technology, December 2007) for example, often can fall short. That’s because many of the most important topics are not discussed in a public forum, but rather in hallways, private offices or parking lots, thus fragmenting concerns and issues and hindering learning. When carefully examining human reasoning and decision making-in-action, users of Collaborative Design quickly come to realize cultures can vary, but underlying human reasoning and assumptions vary little.

Collaborative Design integrates management development and business applications into one compact business system. Team-building, leadership, continuous learning, self-assessment, etc. are not fragmented out into separate subject matter in the hopes that some skills will transfer to the job. Work management processes, defect elimination, RCM, improved outage and turnaround efficiencies, better sales calls, enhanced managerial leadership and coaching competence are fertile ground for Collaborative Design because all of these business applications rest on human reasoning and the decisions that result from it (Fig. 1).

Perhaps most importantly, Collaborative Design points out that misalignment-in-action is not due to some character flaw or innate human badness. Rather, the power of Collaborative Design rests in its promise to productively reveal assumptions that typically aren’t questioned.


Implementation of the basic Collaborative Design process is as follows. While there are important nuances, not all can be explored within the scope of this month’s article.

Role of the Invitationalist…
To start, you can’t do it alone. You need a knowledgeable, external “Invitationalist” (part teacher, facilitator, consultant and mutual learner) who can quickly verify his or her competence in functional tool application. Can the teacher ethically walk the talk? The role of the Invitationalist is to:

  • collectively establish a common dictionary of terms.
  • collectively establish a definition of valid data.
  • support the introduction of data-collection-in-action.
  • help build action cases revealing decision-making-in-action.
  • assure a reasonable test of the functional tools and learn while validating skill transfer to a core internal group. This is especially important early on because learning a tool for the first time requires making mistakes and learners can quickly blame the tool, rather than their inability to use it. This is like blaming a tennis racket or golf club for limits in our game.

Without an Invitationalist modeling tool application, productively uncovering limiting, underlying assumptions and undiscussability is unlikely.

Steps in the process…
Initial introduction of Collaborative Design starts at the executive level. The speed and precision of the installation are directly proportional to the level of executive involvement. No big surprise. The process begins with the steps in “Phase 1: Individual Development” (refer to Fig. 2 below).

STEP 1
As a starting point, conducting the Learning Exercise is essential. This unique activity creates an invitation by setting up an informed choice to learn. It is a fact-finding and definitional process, combined with a peek under the blanket, revealing the vision and potential of Collaborative Design and its functional tools. The Exercise uses learner data, introduces the notion of internal dialogue and private reasoning, establishes effective and ineffective decision-making patterns, drives down the anxiety associated with mistakes, and costs out the impact of private reasoning and undiscussability.



STEP 2
Based on the Learning Exercise experience, the Invitationalist and the group begin to practice Collaborative Design from the start by designing the project plan and assuring a reasonable project timeline for learning functional tools. The objective of Phase 1 is to validate tool application in daily business. This application prepares the first contingent of participants to learn how to learn from direct experience— something that is crucial for skill transfer and future sustainability, since functional tool users actually experience the value of application and its dilemmas.


STEP 3
Applying Collaborative Design, participants learn how to use audio-taped data to collect cultural decision-making data-in-action. Taped data, when properly introduced and managed, will meet confidentiality and legal requirements. Participants record selected meetings in which they participate, just like monitoring equipment in action. Action data is important and fosters the quickest learning because it doesn’t rely on someone’s singular interpretation of a crucial meeting. Instead, it provides a directly observable record that can be publicly examined, leading to more than one interpretation. Participants can determine the root cause of their decision-making and behavioral gaps, and can begin to hear their application of the functional tools as they seek to close gaps and measure the value. This is critical for validation.



STEP 4
With Collaborative Design Case Analysis Tools, each participant creates a compact action case [Ref. 2] based on a selected decision-making point deemed important by the participant. Using the action data, participants meet one-on-one with the Invitationalist and seek to uncover their root-cause assumptions, personal issues, patterns and the business costs of their decision-making-in-action while practicing functional tools. This is the heart of personal reflection.

Each participant designs personal solutions to identified gaps, preparing and practicing before trying to apply. It is here that data drives theory about root cause; is the problem linked to conflict resolution, leadership or a lack of common definitions, etc.? Hence, behavior is changed by altering reasoning patterns based on action data first, rather than, as traditional applications do, by focusing solely on manipulating behavior or forcing patterns into preconceived, theoretical models.

An important role for the Invitationalist during this early phase is pointing out that skill application varies by individual. Some will quickly migrate to use, others more slowly. Skill expansion is directly proportional to the willingness to take risks, make mistakes, build a pool of experience and engage in continuous practice. The Invitationalist helps participants stretch their risk-taking and provides support when failures occur.

STEP 5
With the agreed upon solution in place, the participant, with the required help of the Invitationalist, applies the solution in action and validates the effectiveness. If needed, the Invitationalist may conduct follow-up quality-assurance interviews with staff who were involved in the Collaborative Design application. “Phase 2: Team Co-Creation” (refer to Fig. 3) now can begin.


STEP 6
After working on their personal cases, the executive group reconvenes, shares cases, builds its theory of decision-making-in-action, validates costs and the value of investing in change, and begins to expand the application by digging deeper into the executive team’s co-created decision making and its associated costs in the moment. With the individual learning under their belts, team members are now ready to expand the application and examine other team co-created decisions. The value is ratcheted up and the functional tools mitigate risk, so no one is “making a career decision” by pointing out undiscussable or “spin” issues.

STEP 7
The executive team validates its collective ability to produce Collaborative Design and the enhanced business value. For example, a vice president and his team discovered they could do strategy building in three hours instead of three days once they came to understand how they confused, argued and spun future scenarios that were only empirically testable, yet acted as if their definitions and scenarios were accurate and true. The result had been few or no decisions, or compromise at best.

STEP 8
With gaps detected and value confirmed, the executive group identifies and invites the next group of stakeholders to participate in the learning, usually a group or mix of groups with high potential competitive impact.

Now, it’s on to the final step…

STEP 9
The process repeats itself.



Importance of practice
I once observed a seasoned mechanic working on a motor. The first things I noticed were how quickly and assuredly his hands moved; how quickly he used his tools and removed the motor from its mounting brackets; how quickly he broke the bolts, disassembled the motor, diagnosed, found and fixed the problem. He then reassembled the unit just as quickly.

When I marveled at his skill, he looked at me incredulously and remarked, “Good grief, I’ve been practicing for 30 years. Of course, when I started, I always busted my knuckles just like everyone else.”

Learning functional tools is no different, although each individual’s rate of skill acquisition can vary. In addition, as mistakes are made and knuckles are busted, issues of error avoidance, mistakes and looking incompetent will raise their heads over and over again. It never goes away—and there will be substantial pressure to return to the status quo from all quarters. There are some rather predictable stages of learning through which teams typically pass (see Fig. 4). They are:

  • Awareness: Through the Learning Exercise, the team comes to understand how private reasoning shapes the culture and impacts performance.
  • Acceptance: Once identified, the team has to accept the costs to organizational performance and human suffering. Acceptance is an important step in stepping up to a new performance level.
  • Decision: Once the patterns of private reasoning, side-stepping, spin etc. are identified and accepted, the team must make a decision and commit to change.
  • Tool Practice: Measurable change in decision making is marked by working from internal dialogues and practicing active inquiry through functional tool application. It is not unusual for teams to fail at first; old habits must be let go and replaced by new ones. This is normal when learning any new skill. The taped data will verify tool application. When new skills replace old, however, the level of performance can increase exponentially.

In summary
Collaborative Design is a new generation of change application. Its vision is to maximize performance while maintaining human dignity. Not surprisingly, there are some predictable stages that learners must go through to achieve a culture-of-reliability and the promise of high performance.

Collaborative Design can be used in any business application, but it is at its best when the stakes are high, either in substance or perception. Like any application built on continuous learning, its results have been encouraging and, as they should, new frontiers are always revealed. Because it engages human reasoning and the resulting decision-making process, Collaborative Design can be applied in any business setting. MT

Brian Becker is a senior project manager with Reliability Management Group (RMG), a Minneapolis-based consulting firm. With 27 years of business experience, he has been both a consultant and a manager. Becker holds a Harvard doctorate with a management focus. For more information, e-mail:


  1. Survey data is valuable for picking up routine issues, but is unlikely to pick up undiscussable issues because acceptance is tacitly held.
  2. There are various ways to create an action case study.




On-Site Infrared Analysis For Lubrication Condition Monitoring

Remote, challenging operations? No problem for this advanced technology. It pays off in real time by going directly to your equipment, anywhere in the world, any time you need it.

Lubricating fluids degrade over time, depending on various external and internal influences, including equipment type and age, ambient temperature and humidity, and the degree of use and load on the equipment. It is well established that monitoring the health of lubricating fluids is an important and necessary part of high-value machinery maintenance. The traditional approach for determining the condition of these vital lubricants is to take a sample, send it off for analysis at a commercial testing lab, then track trends in changes in key lube parameters over time. When these analyses indicate a problem, corrective actions such as refreshing or changing the lubricant are taken.

As companies move from preventive maintenance to proactive maintenance, there is increasing interest in on-site lubricant testing because results can be obtained much faster—and they may be more trustworthy. On-site testing allows lubrication specialists and maintenance personnel to take decisive action right away. This latter point is important since some of the degradation processes in lubricants occur non-linearly in time and more quickly than one might expect, which can lead to increased equipment wear or failure. Of course, the ability to use on-site testing equipment is predicated on the ability of testing equipment manufacturers to make their products straightforward to use and capable of providing valuable information.

A number of analysis methods have made the jump from use by experts off site to routine use by lubrication specialists on site. One technique not making that jump—until now—has been infrared spectroscopy. Infrared has been used for years to evaluate lubricating fluids, but virtually always in off-site commercial labs. Now, though, infrared analysis also is available for use in on-site facilities.

Monitoring critical lubricant parameters
There are several key lubricant parameters for which infrared analysis can provide highly accurate information, including:

  • The level of water present
  • The amount of oxidation and nitration by-products
  • The amount of anti-wear, anti-oxidation and extreme pressure additives remaining

All of these parameters are critical—and some can be measured with other methods. No other technology, however, can provide information on all parameters simultaneously, in less than two minutes. The use of infrared analysis for each parameter will be explored here.
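The value of getting all of these parameters in one fast measurement is that each result can be checked against an alert limit on the spot. The sketch below illustrates the idea; the parameter names and threshold values are invented for illustration and would in practice come from the lubricant supplier, the equipment OEM and the analyzer's calibration.

```python
# Hypothetical screening of a single on-site FTIR lubricant measurement.
# Limits are illustrative only; real values depend on the lubricant,
# the equipment and the analyzer's calibration.

LIMITS = {
    "water_ppm":       ("max", 500.0),  # free/dissolved water
    "oxidation_abs":   ("max", 0.25),   # oxidation by-product absorbance
    "nitration_abs":   ("max", 0.20),   # nitration by-product absorbance
    "antioxidant_pct": ("min", 25.0),   # % of fresh-oil additive remaining
}

def screen(sample: dict) -> list:
    """Return the parameters that breach their alert limits."""
    alerts = []
    for name, (kind, limit) in LIMITS.items():
        value = sample[name]
        if (kind == "max" and value > limit) or \
           (kind == "min" and value < limit):
            alerts.append(name)
    return alerts

sample = {"water_ppm": 820.0, "oxidation_abs": 0.12,
          "nitration_abs": 0.05, "antioxidant_pct": 18.0}
print(screen(sample))  # -> ['water_ppm', 'antioxidant_pct']
```

A screen like this is only a triage step: any breached limit would normally trigger more frequent sampling or a full laboratory analysis.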

Infrared analysis for water
The amount of water that is present in lubricants is critical to the performance and longevity of the lubricated equipment. Lubricant properties affected by the presence of water include viscosity (measure of the oil’s resistance to flow), specific gravity (density of the oil relative to that of water), and the surface tension (a measure of the stickiness between surface molecules of a liquid). All of these properties are important for the ability of the oil to coat, lubricate and protect the critical mechanical clearances. In addition, the presence of water can accelerate additive depletion and contribute to chemical degradation mechanisms such as oxidation, nitration and varnish formation.

The ability to measure water on-site provides a substantial benefit in ensuring accuracy of results. Off-site analysis for trace water may be compromised by variability in water concentration introduced during storage, transportation or shipment of a sample. Furthermore, some lubricants contain de-emulsifying additives that cause microscopic water droplets to separate and concentrate in layers at the bottom and sides of sampling containers. This de-emulsifying action takes time to occur and can cause large variations in analytical measurements. In addition, lubricant samples can lose water through evaporation and loss to the sample container walls. To obtain an accurate picture of the amount of water present, the measurement should be made soon after the sample is pulled from the machine.

Analytical determination of water in lubricants typically is carried out using the well-established Karl Fischer (KF) coulometric titration. KF has some practical drawbacks for on-site analysis, including complicated sample preparation, the use of hazardous and expensive chemical reagents and the length of time required to perform the analysis. Despite these issues, KF analysis is still considered the “gold standard” method for analyzing water in oil because it provides accurate and precise answers. Under ideal conditions, Karl Fischer has an accuracy of 3-5% for the prediction of water in lubricants.

While infrared spectroscopy provides an easy means to measure water, only recently has this technology been able to provide the accuracy and range desired by the lubrication industry. New developments in the ability to use FTIR spectroscopy to carry out customized methods have made the analysis of low levels of water in lubricants possible, overcoming earlier technical difficulties. These new methods, coupled with a dedicated on-site infrared analyzer, measure the concentration of water in mineral-based oils with an accuracy and range equivalent to the Karl Fischer method. FTIR allows this measurement to be carried out on a single drop of lubricant, requires no hazardous or expensive reagents, and takes significantly less time to complete than KF.

Methods to directly measure water in mineral oils via infrared spectroscopy have been available for over 30 years. For example, the ASTM E2412 method was originally designed for use with motor oil. Routinely containing 1000 to 2000 ppm of water, motor oil has additives that solvate the water into the oil. The methods developed to measure water in these oils by infrared analysis were targeted at large concentrations and had correspondingly large errors associated with them. Other lubricants (such as turbine oil) solvate significantly less water—typically 50 to 100 ppm. In these lubricants, higher levels of water form small droplets that eventually settle to the bottom of the turbine oil. If the ASTM E2412 method for water is used for turbine oil, measurement variability of up to 40% on replicate samples is observed.

The primary reason the conventional method for measuring water in oil by FTIR produces a high error in turbine oils is water separation—water separates into small droplets in turbine oil. These small droplets scatter instead of absorb infrared light, and only the light that is absorbed contributes to the measurement of water. Over time, it became clear that a means of stabilizing the water in the oil would be needed to reduce variability.
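The measurement itself rests on the Beer-Lambert relationship: for a stable, non-scattering sample, absorbance at a water-sensitive band is proportional to concentration, so a linear calibration against known standards converts an absorbance reading to ppm. The sketch below shows that conversion; the calibration points are invented for illustration, and a real method would use the analyzer vendor's validated calibration.

```python
# Minimal Beer-Lambert style calibration sketch: absorbance at a
# water-sensitive band is assumed linear in concentration (A = m*c + b).
# The standards below are hypothetical, for illustration only.

def fit_line(xs, ys):
    """Ordinary least-squares fit returning (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Hypothetical standards: water concentration (ppm) vs. peak absorbance.
conc = [0.0, 100.0, 200.0, 400.0]
absb = [0.010, 0.050, 0.090, 0.170]

slope, intercept = fit_line(conc, absb)

def water_ppm(absorbance):
    """Invert the calibration to predict concentration from absorbance."""
    return (absorbance - intercept) / slope

print(round(water_ppm(0.130)))  # -> 300
```

This linearity is exactly what droplet scattering destroys, which is why the stabilization step described next is needed for turbine oils.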

Water stabilization method for infrared analysis
A new method (patent pending) has been developed for the measurement of water in turbine oil. This method, reflected by the data in Table I, uses a surfactant to distribute and stabilize the water in the oil, creating a stable emulsion with uniform water droplet size. Addition of approximately 3% of a premixed non-ionic polyethylene oxide based surfactant blend and gentle mixing effectively stabilizes the water in the lubricant.

Determining degree of oxidation and antioxidant depletion
Oxidation is the most significant cause of lubrication breakdown. It occurs when the hydrocarbon components of the lube combine with oxygen to form a wide range of harmful by-products including ketones, aldehydes and carboxylic acids. Once these compounds form, they in turn combine with other species in the lube and form even more unwanted degradative products. Virtually all of the chemical species that result from oxidative processes can be detected and measured by infrared analysis (Fig. 1). Early detection of these species allows for remediation action to slow down the oxidation process.

The phenolic and aminic antioxidants in lubricants function as preservatives that prevent the oil from oxidizing. Oxidation causes lubricants to quickly lose viscosity and the wetting characteristics that protect metal contact surfaces and prevent wear. Oxidation arises from a combination of sources—including elevated temperatures, extreme pressures, high shear conditions and the presence of water and metal particles—and is accelerated by electrostatic sparking, particularly in certain gas turbine systems. Although antioxidants inhibit the formation of these decomposition products, once the antioxidants are consumed, oxidation accelerates exponentially and at a certain critical point corrective action has negligible benefit. On-site analysis offers a significant benefit in this regard by ensuring that both the antioxidant levels and the amount of oxidation present are known in time for corrective action to be taken before the critical point is reached.
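Because oxidation accelerates sharply once the antioxidants are gone, one practical use of trended on-site results is to extrapolate the depletion curve and estimate how much running time remains before the critical level. The sketch below uses a simple linear fit on hypothetical data; real depletion is often non-linear, so this would only be a first-order early-warning estimate, not a precise remaining-life prediction.

```python
# Hedged sketch: linearly extrapolate antioxidant depletion to estimate
# when the remaining level crosses a critical threshold. The data and the
# 25% threshold are illustrative, not measured values.

hours     = [0, 500, 1000, 1500]       # machine run hours at sampling
remaining = [100.0, 90.0, 81.0, 70.0]  # antioxidant, % of fresh oil

CRITICAL = 25.0  # % remaining below which corrective action helps little

# Least-squares slope/intercept of remaining vs. hours.
n = len(hours)
mh = sum(hours) / n
mr = sum(remaining) / n
slope = sum((h - mh) * (r - mr) for h, r in zip(hours, remaining)) / \
        sum((h - mh) ** 2 for h in hours)
intercept = mr - slope * mh

# Solve CRITICAL = slope * t + intercept for t.
hours_to_critical = (CRITICAL - intercept) / slope
print(round(hours_to_critical))  # -> 3793
```

The point of on-site analysis is that each new sample tightens this estimate while there is still time to act.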

Infrared compared to other oxidation-measuring technology
Infrared analyzers require a drop of neat oil, with no sample preparation. Voltammetric systems, by contrast, require careful pipetting techniques and an extraction step involving an electrolyte solution. The extraction step assumes that all of the antioxidants are extracted from the oil into the electrolyte solution. However, extraction efficiencies for additives in oils are variable. Ranging from 50-90%, these efficiencies may leave 10-50% of additives in the oil after extraction, and thus unmeasured. Moreover, voltammetric electrodes require maintenance, such as conditioning in buffer solutions. Metal particles, water or organic salts (i.e., ionized carboxyls such as copper carboxylates) do not interfere with antioxidant measurements made by infrared spectroscopy.
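The arithmetic behind that comparison is worth making explicit: if only a fraction of an additive reaches the electrolyte, the voltammetric reading under-reports the true concentration by the unextracted share. The numbers below are illustrative.

```python
# Worked example of the extraction-efficiency arithmetic: a partial
# extraction under-reports the additive by the unextracted fraction.

true_ppm = 1000.0  # actual additive in the oil (illustrative value)

for efficiency in (0.50, 0.90):
    measured = true_ppm * efficiency
    missed_pct = (1.0 - efficiency) * 100.0
    print(f"{efficiency:.0%} extraction -> reads {measured:.0f} ppm, "
          f"{missed_pct:.0f}% of the additive never measured")
```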


Real-time, on-site FTIR analysis offers a number of potential—and important—benefits to lubrication specialists and maintenance personnel. They include the ability to:

  • Analyze lubricants more frequently, especially when previous analyses indicate that machinery needs more careful monitoring… When the performance of lubricating fluid begins to degrade, or if earlier analyses indicate the presence of a mechanical problem, it is important to monitor the lubricant more frequently because the process of deleterious change can accelerate rapidly.
  • Help reduce machinery wear caused by rapid oil breakdown and to detect problems that could cause catastrophic failures… For example, an anti-freeze leak causes excessive levels of water and glycol to be present in engine oil; these levels can be readily detected by FTIR. More frequent monitoring of engine oil by real-time FTIR can quickly catch these contaminants before they have a chance to cause catastrophic damage to an engine.
  • Ascertain the condition of lubricants in remotely deployed equipment, for which the delay in receiving information from off-site labs may be unacceptable… On-site FTIR analysis minimizes the need to send lubrication samples to off-site labs for condition-based monitoring. It is especially important that equipment operating in these remote locations be carefully monitored since ambient conditions may be particularly challenging.
  • Act as the supporting analytical technology in programs designed to bring lubricants back to spec via readditization… FTIR is a powerful method for analysis of anti-wear and anti-oxidation additives. More companies are looking to extend the use of lubricants by refreshing critical additives to bring the lubricant back to spec. Real-time, on-site FTIR can be a powerful tool for determining how much additive should be recharged and for monitoring the overall refreshed oil composition.
  • Enable maintenance personnel to make better decisions on when to send oil samples for full analysis… Real-time FTIR is an excellent screening technology to detect problems with both the lubricating fluid and the lubricated equipment. More frequent screening with FTIR enables personnel to make informed decisions on when to send samples for full elemental analysis, in order to try to pinpoint specific internal machine problems that may indicate excessive mechanical wear.
  • Determine that incoming lubricants are properly formulated, not contaminated in shipping or mislabeled, and that the correct lubrication fluid is charged into the machinery… It is vitally important to use lubricants that meet the equipment manufacturer’s specifications. When special lubricants are ordered and shipped, mistakes can occur in formulation or in delivery. A portable FTIR system can be used right at the loading dock or at the tanker truck delivering fluids, to ensure that the delivery matches the expected formulation.

Infrared spectroscopy provides an immediate snapshot of the overall health of lubricating fluids—it is a window to the vital signs of both the lubricants and the equipment they protect. Few analytical techniques provide so much information about the key parameters that affect lubricating fluid life and engine health. With the new generation of infrared analyzers, the technology can now be used where it is needed, on site, wherever machinery is in use. That includes some of the most remote and challenging industrial operations on earth. This new approach assists maintenance, service and equipment reliability personnel in making rapid, actionable decisions based on objective analytical data. MT

Frank Higgins is application scientist and John Seelenbinder is product development manager with A2 Technologies. With U.S. headquarters in Danbury, CT, A2 develops, manufactures and markets a comprehensive line of innovative, mobile fluid analysis tools to industries around the globe. E-mail:




Driving Operational Improvements Through Strategic Alignment


Fundamental to success in any organization is getting individuals to work toward common goals. Whether that’s a team of five on the court or a corporation of 50,000 associates scattered across the globe, knowing the goal and working toward unified objectives help every individual contribute. In global manufacturing, however, we frequently see a disconnect in this unified approach.

As global economic trends lead to changes in manufacturing strategies, companies today are realizing that successful financial performance can only be achieved when functional decisions are synchronized and fully aligned with plant or corporate goals and objectives. In rethinking the value and contribution of the manufacturing organization, companies have an opportunity to revitalize their business performance and bring new capabilities to their strategic focus.

A historical disconnect
The front office traditionally has had little direct influence on the plant floor beyond providing budgets and productivity demands. Conversely, the plant floor has little executive visibility, meaning manufacturing considerations are less likely to be taken into account when corporate managers are setting business objectives. In the rare instances when these overarching objectives are communicated to those responsible for the plant floor, it’s difficult to reconcile them with plant floor deliverables, as corporate terminology and plant floor metrics rarely converge. This leaves plant managers to set goals and make decisions that risk running counter to the company’s overall objectives as they strive to reach productivity metrics.

The renewed emphasis on effective capital asset management is putting increased pressure on plant managers to contribute to the growth and financial performance of their organizations. The touted benefits of individual initiatives, such as process efficiency and improved quality, mean little if they fail to help plant floor personnel understand how they can help address the fundamental corporate goals.

One difference between organizations that succeed and those that fail has to do with the way the manufacturing function is structured, the responsibilities and tactical vision of the plant manager, and the level of integration between plant floor decision making and the strategic direction of the enterprise as a whole.

Clearing the hurdles
One of the main obstacles to strategic alignment is the modern global enterprise itself, which comprises multiple facilities in widely dispersed geographic locations. On the plant floor, localized tactical deployments and siloed functions have led to unique, dedicated systems for manufacturing planning, execution, process control and tracking, oftentimes for each plant. Consequently, the plant floor has become the sole focus of the plant manager, where decisions are made primarily to meet production deadlines and efficiencies, rather than with a more holistic view of company objectives.

Central control through large functional departments also can act as a barrier to strategic execution. Executives typically develop strategy at the top and implement it through a centralized command-and-control culture. This system was acceptable 40 to 50 years ago when change was incremental, but is inadequate in today’s dynamic business environment. Rapid changes in technology, competition and regulations mean that strategy development and implementation have to be a continual and participative process.

Coordinated metric development is another fundamental challenge to strategic alignment. For instance, in many companies, there is no visibility to the losses incurred from unnecessary downtime or late deliveries, and no tangible returns attached to manufacturing’s role in meeting quality standards or making on-time deliveries. Consequently, many companies grossly underestimate the overall effect plant floor decisions have on the company’s bottom line.
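Making those invisible losses visible can start with a simple loss model that restates downtime in the front office’s own currency. A minimal sketch, with entirely hypothetical rates and hours (none of these figures come from the article):

```python
# Estimate the annual cost of unplanned downtime so plant floor
# losses can be expressed in front-office (dollar) terms.
# All input figures below are illustrative assumptions.

def downtime_cost(hours_down, units_per_hour, margin_per_unit,
                  fixed_cost_per_hour=0.0):
    """Lost contribution margin plus fixed overhead absorbed while idle."""
    lost_margin = hours_down * units_per_hour * margin_per_unit
    absorbed_fixed = hours_down * fixed_cost_per_hour
    return lost_margin + absorbed_fixed

# Example: 120 hours of unplanned downtime per year, 200 units/hour,
# $4.50 contribution margin per unit, $1,500/hour of fixed overhead.
annual_loss = downtime_cost(120, 200, 4.50, fixed_cost_per_hour=1500)
print(f"Annual downtime loss: ${annual_loss:,.0f}")  # $288,000
```

Even a rough model like this attaches a tangible return to uptime, which is the first step toward the coordinated metrics the article calls for.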

Communication is another hurdle. Organizations today need a language for communicating strategy, as well as processes and systems to implement strategy and gain feedback about it. If the strategy does not get translated through the organization to each individual person, then successful execution is at risk. Ultimately, people must have a “line of sight” between their role and the objectives and implementation of the strategy.

Integration at all levels
Much of the progress companies have made toward strategic alignment has been simply the result of better information integration across the enterprise. Tremendous operational efficiencies have been gained by connecting “islands of factories” together into a single integrated manufacturing enterprise. This allows companies to drive operational excellence across and beyond the entire enterprise, including business processes, supply chains and customer networks.

For example, planning long-term shutdowns for capital repairs needs long-term visibility into sales and operations planning. Likewise, the factory supply chain needs to consider and integrate the maintenance function in order to be responsive and proactive. This requires rethinking the way plant floor functions are executed, as well as providing support through integrated systems that unify data protocol across plant-wide systems and processes and into executive suites.

This seamless information sharing results in knowledge that improves performance and meets core business objectives. If the plant floor understands, for example, that on-time delivery is more important to helping reach corporate customer-satisfaction goals than cost savings, it can add a second shift to help meet those on-time delivery goals.

While most plant floor decisions are grounded in the same fundamental vision as the rest of the company, the manufacturing function often operates with different priorities and different reward systems than the rest of the organization. Achieving strategic alignment requires every organizational function to be working toward the same goals. This means strategy must be communicated and then aligned with the personal objectives of individuals throughout the organization—not just at the corporate level.

Just as corporate managers often don’t see eye-to-eye with plant managers, the reverse is also true. When communicating the value that manufacturing provides, plant managers need to link results back to the metrics that drive the company’s business, demonstrating how these pertain to management goals and customer demands. For example, how will installing a new condition monitoring system help improve equipment uptime and reduce expenses related to lost production and scrap? More specifically, how does this impact the price-per-product ratio—an underlying management goal? Another example is the incompatibility of purchasing metrics with overall plant management’s capital spending goals. There are instances when purchasing’s focus on lower prices may lead to decisions based on unit cost rather than total installed cost of the system or long-term maintainability.

Naturally, each group pursues business objectives from different perspectives. In many cases, distinct differences in language and methods of communication lead to misinterpretations and a general lack of understanding between the top floor and the shop floor. Therefore, it’s important that organizations translate the strategy into operational terms.

For example, most companies hinge their success on a simple principle: deliver high quality products at affordable prices. To meet this goal, every facet and supporting element of a company’s manufacturing process needs to be as lean as possible.

By leveraging a plant floor strategy that focuses on reducing expenses, improving uptime and optimizing production processes, the company can parlay this philosophy into higher profits in the long-term while gaining a distinct competitive advantage. Without a cohesive understanding of these objectives, however, support personnel might take a short-term view of this approach and cut costs wherever possible, sacrificing the long-term goal for short-term gains. For example, the condition monitoring system mentioned before may provide significant long-term productivity benefits to the factory, but budgetary constraints and performance metrics driven through the purchasing department may lead to a more traditional system. The unit cost would be less but the ongoing benefits would be lost.
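The unit-cost-versus-lifecycle trade-off described above can be framed numerically: compare the two systems on total cost over their service life rather than on purchase price alone. A hedged sketch, where every price, upkeep cost and avoided-downtime figure is an invented assumption rather than data from the article:

```python
# Compare two systems on total lifecycle cost, not purchase price.
# All dollar figures and the service life are illustrative assumptions.

def lifecycle_cost(purchase, install, annual_upkeep,
                   annual_downtime_loss, years):
    """Total installed cost plus recurring costs over the service life."""
    return purchase + install + years * (annual_upkeep + annual_downtime_loss)

YEARS = 10
# Traditional system: cheaper to buy, but without condition monitoring
# it suffers more unplanned downtime each year.
traditional = lifecycle_cost(50_000, 10_000, 5_000, 40_000, YEARS)
# Condition-monitored system: higher unit cost, far less downtime.
monitored = lifecycle_cost(80_000, 15_000, 8_000, 10_000, YEARS)

print(f"Traditional: ${traditional:,}")  # $510,000
print(f"Monitored:   ${monitored:,}")    # $275,000
```

Under these assumptions the "cheaper" system costs nearly twice as much over ten years, which is exactly the distinction purchasing metrics focused on unit price can miss.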

In other organizations, the value brought by the plant floor may be measured by how it impacts production throughput. Here, the equation is simple: if machines aren’t available, the company can’t produce products and profit opportunities are missed. In this scenario, the entire manufacturing organization takes equal responsibility for uptime, quality and profitability. The goal is to make a certain number of units per day, based on market demand, and do whatever it takes to get it done.

In this situation, the priority of plant floor personnel isn’t on preventive activities, but rather on directly supporting production output goals. But, if a plant manager is not briefed on the strategic objectives of the company and how they apply, he or she may approach a repair intent on getting the plant up and running as cheaply as possible. If a plant manager knows the company objective involves a long-term approach to productivity and profitability, all the options may be reviewed in order to find the one that meshes best with the company’s goals.

Measurement is key
Finding a way to measure improvements is an important step toward achieving strategic alignment. Every organization measures success by some metric, whether it’s price per unit, earnings per share or total sales. Unfortunately, the metrics used in the front office aren’t always easily converted into day-to-day tactics employed on the plant floor or in other internal departments, like marketing or accounting.

Despite changes in the speed of business and the availability of information, the methods for evaluating corporate performance remain largely unchanged. The problem with many of these tools is they offer a siloed approach and fail to capture many of the interdependencies among functional areas and link them to wider business goals.

A multi-dimensional view is necessary because any one performance measure can be managed to the detriment of other measures (e.g., the benefits of reduced inventory can be offset by an increase in overtime and expediting costs). Consequently, it’s imperative that measurements be based on the priorities of the strategic plan and that they provide data about key processes, outputs and results.

The measures should be selected to best represent the factors that lead to improved customer, operational and financial performance. For example, most plant managers are concerned primarily with short-term budgets and productivity. A company that includes sustainability as part of its strategic objective, though, needs to brief its plant manager(s) on that goal so they take these elements into account. Such an approach might encourage investing in energy-efficient drives to reach sustainability metrics.

One technique that has proven effective in helping companies align their business and plant-level strategy is the development of cross-functional scorecards, often referred to as the “Balanced Scorecard.” Used for more than a decade as a strategic planning and management system for driving accountability for execution, the Balanced Scorecard creates a system of linked objectives, measures and targets, which collectively describe the strategy of an organization and how that strategy can be achieved. Individual departments can retain their individual priorities yet know their contribution and role in the overall strategic framework.

One advantage of the Balanced Scorecard approach is that it provides a framework that adds strategic non-financial performance measures to traditional financial metrics to give managers a more “balanced” view of organizational performance. To provide detailed strategy at the corporate as well as plant level, companies can build scorecards for all business units and key support functions. When implemented successfully, it offers a truly bottom-up approach, supplying managers with feedback around both the internal business processes and external outcomes in order to continuously improve strategic performance and results.
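The “system of linked objectives, measures and targets” can be sketched as a simple data structure in which each plant-level measure carries its scorecard perspective, a target, and the corporate objective it rolls up to. All objectives, measures and targets below are invented for illustration:

```python
# A minimal Balanced Scorecard sketch: each measure records its
# perspective, a target, and the corporate objective it supports.
# Every objective, measure and number here is a hypothetical example.
from dataclasses import dataclass

@dataclass
class Measure:
    name: str
    perspective: str          # financial, customer, internal, or learning
    target: float
    actual: float
    higher_is_better: bool    # downtime and energy targets are ceilings
    corporate_objective: str  # the strategic goal this measure rolls up to

    def on_track(self) -> bool:
        if self.higher_is_better:
            return self.actual >= self.target
        return self.actual <= self.target

scorecard = [
    Measure("On-time delivery %", "customer", 95.0, 97.2, True,
            "Improve customer satisfaction"),
    Measure("Unplanned downtime (h/month)", "internal", 20.0, 26.5, False,
            "Grow operating margin"),
    Measure("Energy use per unit (kWh)", "internal", 3.5, 3.2, False,
            "Meet sustainability goals"),
]

for m in scorecard:
    status = "on track" if m.on_track() else "behind"
    print(f"{m.name}: {status} -> supports '{m.corporate_objective}'")
```

Even this toy structure makes the “line of sight” explicit: a plant manager reviewing the downtime measure sees not just a local metric that is behind, but the corporate objective the shortfall threatens.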

Widening the accountability
Nothing kills a strategy faster than under-committing resources. Thus, it’s critical that managers understand the financial commitments that are required to implement a plan and provide the necessary support once the plan is approved. While there are no easy choices or silver bullets here, the foundation for strategic alignment is one that takes a disciplined approach, includes well-defined, balanced objectives and drives accountability and transparency for the decisions and actions that are made.

With today’s advances in technology, companies now can fine-tune almost every phase of production for maximum yield, quality and profit. Still, technology is only part of the equation. The ability to align business strategy across the organization is the missing link. While a unified business strategy isn’t going to solve every problem, it does widen the accountability for financial performance from the top floor to the plant floor. This is one trend that most certainly will pay dividends in today’s highly competitive manufacturing market. MT

Bob Ruff is senior vice president of Control Products & Solutions, Rockwell Automation.
