By Adam Watts, Imagine Technologies Group.
Drones are quickly growing more popular within the roofing industry, and drone inspections are becoming increasingly commonplace. When used correctly, drones offer roofers safer, faster and more cost-effective ways to inspect roofs. A pilot’s skill level is a crucial part of a safe drone flight, but that skill is only as good as your business’s approach to training and supervision.
It goes without saying that aviation in any form can be dangerous. The level of risk varies depending on whether you are flying an airliner, a fighter jet, an attack helicopter or a small commercial drone on an inspection, but several common factors help reduce the risk of any drone flight.
The FAA provides extensive guidelines for commercial drone safety, but simply following these minimum guidelines isn’t enough to ensure safe operations. While a pilot’s skill level is very important to safe drone flight operations, it’s not as critical as the organization’s approach to training, supervision and, most importantly, creating a culture of learning from mistakes in a constructive manner.
A few years ago, I witnessed first-hand how organizational mismanagement can lead to a drone accident. I was talking to a drone pilot who worked for a local drone service provider (DSP) and who, at short notice at the end of a long day, had been tasked to capture imagery of a self-support tower. Crucially, while travelling between tasks he had failed to charge his drone’s remote controller, and he started his flights with only one-third of its charge remaining.
He began collecting the required inspection imagery but started to feel time pressure as the evening twilight progressed. His final pass required him to collect imagery of a sector on the opposite side of the tower. The first battery alarm sounded on his iPad as the remote controller dropped to 10% power; then the drone itself reached 20% battery, triggering a second alarm.
Unfortunately, he assumed both alarms referred to the drone’s battery, not realizing the first had been for the remote controller. Judging that he had enough power to capture the remaining images, he pressed on. Approximately two minutes later, the remote shut down and the drone began its pre-programmed Return-To-Home (RTH) mission. The drone had been set to RTH at its current altitude, so it turned directly towards the take-off position.
Because the drone was on the opposite side of the tower from the pilot and the take-off point, tracking home meant flying directly towards the structure. In a state of panic, the pilot could only watch as his drone got within inches of the cell tower before its obstacle detection recognized the self-support framing. The drone initiated an avoidance maneuver and climbed towards a beam that lay in an obstacle-avoidance blind spot.
The drone’s props contacted the beam, and it immediately began an uncontrolled tumbling descent from approximately 75 feet. Fortunately, it impacted open ground, but it came very close to landing on top of the ancillary equipment in the tower compound itself.
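To make the RTH pitfall concrete, here is a minimal, purely illustrative sketch in Python. The function names, heights and distances are assumptions invented for the example, not any manufacturer’s actual flight logic; it simply contrasts a “return at current altitude” setting with one that climbs to a preset safe altitude before heading home.

```python
# Illustrative sketch only -- not any manufacturer's firmware. It contrasts two
# RTH configurations to show why "return at current altitude" is risky when the
# home point sits on the far side of a structure. All values are hypothetical.

from dataclasses import dataclass


@dataclass
class DroneState:
    altitude_ft: float         # current altitude above the take-off point
    distance_home_ft: float    # straight-line distance to the home point
    obstacle_height_ft: float  # tallest obstruction on the direct path home


def plan_rth(state: DroneState, rth_altitude_ft: float, climb_first: bool) -> str:
    """Describe how the drone will fly home once the control link is lost."""
    if climb_first and state.altitude_ft < rth_altitude_ft:
        # Safer profile: climb to a preset RTH altitude above all obstructions,
        # then track straight back to the home point.
        return f"Climb to {rth_altitude_ft} ft, then fly {state.distance_home_ft} ft home"
    if state.altitude_ft <= state.obstacle_height_ft:
        # "Current altitude" profile: the direct track home passes through the
        # structure, leaving collision avoidance as the only remaining defense.
        return "WARNING: direct path home intersects the obstruction"
    return f"Fly home at current altitude ({state.altitude_ft} ft)"


# A profile loosely modelled on the incident: roughly 75 ft up, home point on
# the far side of a tower assumed to be taller than the drone's altitude.
incident = DroneState(altitude_ft=75, distance_home_ft=300, obstacle_height_ft=150)
print(plan_rth(incident, rth_altitude_ft=200, climb_first=False))
print(plan_rth(incident, rth_altitude_ft=200, climb_first=True))
```

The point of the sketch is simply that RTH behavior is a setting the pilot, or an SOP, chooses in advance; around a vertical structure, a “climb first” profile removes the need for obstacle avoidance to save the day.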
What lessons can be distilled from such a drone safety incident? It would be easy to rule that this accident was caused solely by pilot error or incompetence. But if that were our only conclusion, we would be ignoring a host of other issues and, more importantly, setting the organization up to repeat the failure in the future.
The example shows that significant change must occur in the organization if it is to build a safety-focused program for managing drones for inspections. But it is not enough to create rules, impose processes, and run training programs.
The biggest contributor to a long-term drone safety program is a cultural shift in the attitude to mistakes, where an organization’s immediate reaction to an incident is not to look for blame, but to learn the lessons and better prepare the team so the same incident does not occur again. This requires an open and honest system of incident reporting, in which those who make mistakes are supported, not punished, when they come forward after a safety occurrence.
This “just” culture should result in lessons that improve pilot training and standard operating procedures (SOPs), so they are constantly evolving to reflect best practices and prevent repeat safety incidents, occurrences and accidents. In most modern air forces, accident investigators break crashes down into causal factors, with an emphasis on forensically identifying the contributing factors and the organizational changes that could prevent a future occurrence.
So, let’s break this example down into its three main failings:
1 - Pilot training and knowledge deficiencies. The pilot was Part 107 qualified, but his formal training was limited to the technical aspects of capturing drone imagery. He had not been formally trained on drone RTH logic, nor on procedures for programming safe, mission-appropriate RTH settings. He also had not been taught the basics of how obstructions can mask the control-link signal between the remote and the drone, or the circumstances in which a remote-controller failure can trigger RTH.
2 - Supervision culture – SOPs and flight planning. The pilot was dispatched on this drone inspection without a plan that had been approved by company management, and management was not present during the flight. In short, execution was delegated entirely to the pilot, with no direct supervision. He was neither provided with cell-tower-specific checklists prior to the flight, nor given any formal SOPs to provide a framework for how to fly inspection profiles.
It is worth emphasizing that supervision does not need to be physically present; it can be carried out remotely if the correct procedures are in place, or through company-approved SOPs. Supervision should also cover important human factors, such as whether the crew has had adequate rest or is too fatigued to fly, and the environment in which the pilot is operating: will they be comfortable enough to fly safely in extreme temperatures, do they have enough light, and so on? A short, written go/no-go check, like the one sketched after this list, is one way an SOP can make these questions explicit before every flight.
3 - Supervision culture – time and task pressure. It would be fair to say that the pilot put himself under pressure to complete the task, and he should have recognized he was beginning the flight in an unsafe condition, with batteries in a poor state and insufficient for the job. That said, management was fully aware of the flight profile the inspection required, and that the task had been added to the end of an already full day.
The task was also unusual in that the flight profile was around a vertical structure, which carries considerably more risk than a typical inspection. A good supervisor would have recognized this and directed him to fly the next day, with a fresh set of batteries, a clear head and minimal pressure.
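As a companion to the three failings above, here is an equally minimal, hypothetical go/no-go sketch in Python. The function name and every threshold are assumptions made for illustration, not FAA or manufacturer requirements; a real SOP would tailor them to the aircraft, the site and the company’s own risk appetite.

```python
# A hypothetical pre-flight go/no-go check -- one way a DSP's SOP could codify
# the items that failed in this incident. All thresholds are illustrative
# assumptions, not regulatory requirements.

def preflight_go_no_go(drone_battery_pct: int,
                       controller_battery_pct: int,
                       rth_altitude_set_above_structure: bool,
                       minutes_of_daylight_left: int,
                       pilot_rested: bool) -> list[str]:
    """Return a list of blocking issues; an empty list means 'go'."""
    issues = []
    if drone_battery_pct < 80:
        issues.append(f"Drone battery at {drone_battery_pct}% -- start with a full pack")
    if controller_battery_pct < 50:
        issues.append(f"Controller at {controller_battery_pct}% -- charge it before flight")
    if not rth_altitude_set_above_structure:
        issues.append("RTH altitude is not set above the tallest structure on site")
    if minutes_of_daylight_left < 45:
        issues.append("Not enough daylight remaining for the planned profile")
    if not pilot_rested:
        issues.append("Pilot fatigue -- defer the task to the next day")
    return issues


# Conditions loosely modelled on the incident: a tired pilot, a one-third-charged
# controller, default RTH settings and fading light. Every check fails.
problems = preflight_go_no_go(drone_battery_pct=60,
                              controller_battery_pct=33,
                              rth_altitude_set_above_structure=False,
                              minutes_of_daylight_left=30,
                              pilot_rested=False)
print("GO" if not problems else "NO-GO:\n- " + "\n- ".join(problems))
```

The value is not in the code itself but in the discipline it represents: a supervisor and a pilot agreeing, in writing and in advance, on exactly what makes a flight a “no-go.”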
Task pressures can often override immediate safety concerns. This is true in all walks of life, but in aviation disciplines it is especially critical to maintain a strong culture of safety. Professional supervision could have mitigated the safety incident in this case.
There is a well-worn analogy in Royal Air Force flight safety culture: stack enough slices of Swiss cheese together and eventually the holes will line up, letting something pass straight through. It is a cumbersome image, but it illustrates the point that accidents occur when the gaps in an organization’s safety management policy line up.
Drones for inspections are not inherently prone to accidents, but the self-support tower crash occurred because the pilot was deployed to complete a task with inadequate training, with no SOPs, and with poor supervision. Management at the DSP pressed the pilot to fly the cell tower with poor appreciation of the risks the drone inspection task entailed. The holes were numerous, and the outcome inevitable – whether today, or tomorrow. Accidents are going to happen if this culture continues unchecked.
Drones will continue to be a great tool for your organization to reduce costs and risk if well supported by an open and honest culture of continuous improvement. If this culture already exists broadly within your organization, and the effort is made to extend that culture to drones for inspections, you will experience increasingly safe operations through a “just” safety culture supported by training, professional supervision, documented standard operating procedures and shared learnings.
Learn more about the Imagine Technologies Group in their RoofersCoffeeShop® Directories or visit www.imaginetechnologiesgroup.com.
Original article source: Imagine Technologies Group