Too many accidents plague our industry. We read about them and grieve for those who lost their lives or were injured, but we frequently write them off with the mistaken belief that "someone made a mistake" or "someone had bad luck." We seldom look deeper or act in a systematic way. If a report indicates that an accident was caused by bad rigging, we check our rigging and think that we have done all that is needed.
It is not enough. Accidents are symptoms of problems deep in an organization and arise because we permit them to occur. A sling does not wear the first time it is incorrectly used. It wears over time. We see the red flags but frequently take no action.
There have been great improvements in construction safety over the past 15 to 20 years, and equipment managers have played an important part in the process. Everyone is actively engaged in safety training, but most of the effort is reactive and focuses on regulations, awareness and protection. We make sure that we know how to conform and comply; that we know the difference between an unsafe act and an unsafe condition; and that we use hats, gloves, glasses and handrails. Yet, accidents continue to occur. Cranes overturn when we know how to use load charts. Slips and falls occur despite the fact that all our operators know how to get on or get off machines safely. We must do more.
Reactive training in regulation, awareness and protection is important and cannot be undervalued. It must, however, be complemented by proactive leadership that insists on planning and executing safe operations and develops an organizational culture that places safety above all else.
It all starts with knowing and understanding the nature of the work that we do. In his book "Normal Accidents," Charles Perrow contends that we live and work with high-risk systems that fail because they are complex and because their various components are tightly coupled. Erecting a crane is certainly a complex operation and the required sequence of steps is closely coupled: Failure in one easily cascades to precipitate a total collapse. The question therefore arises: Can we make the work safer by reducing complexity and reducing the coupling between the required steps?
Perrow gives some guidance. We can simplify or eliminate high-risk operations even if it costs more time and money. We can de-couple operations by allowing more space, time and flexibility between steps and by ensuring that there is a fail-safe mode built into each step of the process. We can design machines to be fueled from the ground, which simplifies fueling and eliminates the slips and falls that occur when service technicians climb onto machines. We can inflate tires behind an appropriate screen to protect our technicians from hazards.
Perrow's theory is depressing and almost fatalistic. Although we can reduce complexity and coupling, practical or economic limits often stop us from reaching our goal.
Some organizations, however, have superior safety records. They perform complex and tightly coupled operations with few, if any, failures. Civil aviation is a good example. In his book "The Limits of Safety," Scott Sagan proposes that an exceptional safety record is due to organizational culture. He defines the characteristics of "High Reliability Organizations," with four factors that are necessary to ensure superior safety.
"The first and most obvious requirement for high-reliability organizations is that extreme reliability and safety must be held as a priority objective by the leaders and heads of the organization," Sagan says. We know this to be true, but the research Sagan presents gives it added credibility.
Leadership priority and commitment ensure three things. First, they set the example and establish expectations: If leaders do it, then everyone else can be expected to do the same. Second, they ensure that resources are available to support the safety effort. No one can deny that investments are necessary to ensure safe operations, and no one can deny that they make good business sense. Leaders must define, measure and promote the business case for safety. Third, leadership priority and commitment are needed to develop the clear, well-thought-out processes and procedures required to perform safe work. Performing a job-hazard analysis and focusing on safety in an operations pre-plan often come second to the natural focus on time, cost and quality. Leadership and management commitment will ensure that safety comes first.
Sagan also addresses the fact that "human beings are not perfectly rational machines" and that it is extremely difficult to build "reliable systems from unreliable parts." We constantly struggle with problems caused by human error. High-reliability theorists propose that the solution lies in redundancy. Checks and balances ensure that safety-critical activities are performed, monitored and checked by different people so that discrepancies are noticed by "someone" before they cause problems. A shortage of personnel, or pressure to complete work quickly or on a tight budget, frequently causes us to place too much reliance on too few people doing too much work. Under these conditions, failure is almost inevitable. Our work plans and our organization must include the redundancies and checks needed to ensure safety despite our own shortcomings.
The third area that Sagan identifies fits well with the way in which construction companies are organized: High-reliability organizations require a strong organizational culture that supports decentralization of authority and continuous training. Field mechanics work independently, away from immediate oversight and supervision. Safe work therefore depends on the degree to which they personally believe it is important, on the decisions they take on their own in the field, and on the training they have received. Management can insist on safe, well-planned, simple, fail-safe operations, but it cannot oversee and inspect every operation at every location. High-reliability organizations develop an intense personal safety culture and decentralize decisions, confident that their culture will ensure appropriate action. They condone no exceptions.
The last area Sagan identifies as contributing to high degrees of safety is organizational learning. Successful companies learn from their mistakes and adjust their procedures over time. They develop new processes to accommodate change, maintain processes that work, and eliminate those that have not been successful. It is not easy to learn from the mistakes that lead to accidents. Causes are often unclear, events are frequently reconstructed in a way that supports preconceived ideas, and there seems to be more of a need to apportion blame than find cause. Effective learning avoids these pitfalls, addresses the facts no matter how uncomfortable, and shares experiences across the organization as a whole. Lessons learned are implemented aggressively, and the organization moves forward with more strength and more knowledge.
Safety is a prerequisite for good management. It is not negotiable, and there can be no compromise when it comes to conducting our operations in the safest way possible. We deal with complex, tightly coupled systems; accidents will occur if we do not commit to a high-reliability mindset. If we understand this, and if we understand that our organization, culture and leadership lie at the root of each and every accident, then everyone will go home safe every day.
**Successful Safety Attitudes**

| Normal Accident Theory | High Reliability Theory |
|---|---|
| Accidents are inevitable in complex, closely coupled systems. | Accidents can be prevented through good organization design and management. |
| Safety is one of a number of competing objectives. | Safety is the priority organizational objective. |
| Redundancy often increases complexity and encourages risk taking. | Redundancy enhances safety. Checks and balances produce reliable systems out of unreliable parts. |
| Centralization is needed to manage complex, tightly coupled systems. | Culture, decentralized decision making and training cause the right personal action in the field. |
| Denial of responsibility, faulty reporting, and reconstruction of history cripple learning. | Organizations learn from their mistakes by identifying the cause, addressing the facts and sharing lessons. |

These attitudes of safety were developed from Scott D. Sagan's book, "The Limits of Safety."