How I won a yacht race in a submarine and what it taught me about decision making

Matt Offord

How to cheat at competitive sailing

It was 4am on the freezing cold bridge of a submarine. I was the Officer of the Watch, which meant I had control of the vessel as she ploughed through the frigid waters off the west coast of Scotland, moving on the surface. We were travelling at slow speed because we were ahead of schedule for our return to our home base after several months at sea. The look-out and I were keeping a weather eye on the multitude of yachts around us. There was, apparently, a night race on. Being surrounded by tiny yachts in a slow-moving, 16,000-ton submarine was not a situation I had encountered before, despite years of experience at sea. Our schedule forced us to go slowly, so we couldn’t simply pass the yachts and leave them safely behind, and because of our relative sizes a collision would be extremely dangerous for the sailing vessel. Yachts are not easily detected by radar, so information was scarce.

They kept us pretty busy: raising our binoculars to the distinctive navigation lights, assessing each vessel’s course, speed and closest point of approach, then moving on to the next vessel, and so on. Because yachts can change direction quickly, and often need to, there was no rest for us.

But something kept catching my eye. A shore light, some distance away, kept blinking on and off. Flashing lights are there for a reason: to tell the mariner something, such as warning of a danger. Such lights are marked on the chart and on radar displays, but the team in the control room assured me there was no navigation light in that direction. Lights in houses don’t usually flick on and off, and vehicles are easy to spot because they move quickly. This was a fixed light, not a navigation light, yet it behaved like one, blinking on and off. I was curious but had no answer.

I continued my round-robin of assessing the yachts around us, but I was not happy about the curious light. It kept nagging me, and I kept going back to it with my binoculars. Something was not right. Eventually, I was happy with the yacht situation: all the yachts that were going to come within a certain distance had been reported to the Captain, and we would be out of the area in an hour or so. Still I kept going back to the weird light. Then I realised something else: the reflection of the light on the water was strange, not reaching out as far as other reflections. And the light wasn’t blinking. It was more as if it was being covered and uncovered.

Suddenly, realisation dawned. I called into the internal communications:

“Full Astern!”

There were no questions; the order was acknowledged. Almost immediately the wake of the submarine began to churn as the direction of propulsion was reversed. It took a lot longer for the speed of the submarine to reduce; 16,000 tons is a lot of momentum. Full astern is an emergency order, only to be used when absolutely necessary. This very short order contains an unspoken but very clear message: “Go astern as quickly as you physically can; I accept that people may be hurt or (more likely) machinery may be damaged in the process.”

Considering that the bridge of the submarine is about 30 feet (via a vertical ladder) above the access hatch into the pressure hull, where the rest of the crew works and everybody lives, and that the Captain’s cabin is a deck below that hatch, I was astonished at how quickly the Captain arrived, fully dressed, on the bridge. His eyes were everywhere, frantically trying to assess the situation as they adjusted to the dark. I pointed to the port bow: some 100 yards away, blissfully unaware, a tiny yacht pitched up and down on the waves. Its white hull was barely visible and it had no navigation lights on. The only clue to its existence (until it was really close) was the strange way its sails occluded, then revealed, the nearby shore light as she rolled in the waves. By now the submarine was slowing down, and the yacht sauntered in front of us, just yards from the bow. The Captain exhaled in relief; it had been a good call. By now many of the yachts we had threaded through were beginning to catch us up. The Captain agreed that we should speed up and get out of there. Thanks to the advantage of nuclear power, we easily won that yacht race and left those sailors behind, one of whom was completely ignorant of how close he or she had come to a collision.

Data and decision making

I do not hang around the freezing cold bridges of nuclear submarines anymore. But it was a salutary lesson, one I still draw on to this day. As a management consultant working in complex environments, I have noticed that the Information Age has brought a number of special problems not unlike those I faced during the yacht race. The first is information overload, which is a very real problem. We spent a huge amount of time assessing and re-assessing a very busy situation. Arguably, if the radar had been able to detect all the yachts, there would have been even more information to sift through, and a small anomaly such as a blinking light could have been missed.

This matters because there are essentially two types of error. A Type I error is a false positive: asserting an explanation which is actually false. A Type II error is a false negative: failing to assert an explanation which is true. These are also called errors of commission and omission, respectively. Had I not spotted the yacht with no lights, I would have made an error of omission: I would have settled for a narrative which was easy to believe, that all the yachts were lit and could be seen. Another way to look at errors of omission is that they occur when the data which would help us truly understand a situation is absent or obscure. This is the significance of the data which is not there.

The most cited example of this phenomenon is Abraham Wald, a mathematician serving in the Statistical Research Group in the US during WWII. He noted that aircraft returning from operational missions had a greater number of bullet holes in the fuselage. The US military were therefore minded to armour this area alone (too much armour makes an aircraft too heavy). But Wald pointed out that the aircraft which failed to return very likely had a different spread of hits. The aircraft which had been hit in the fuselage had made it back anyway; it was not the fuselage that needed armour but the rest of the plane, especially the vulnerable engines.
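Wald’s reasoning is easy to state but surprisingly easy to lose in real data, so a small simulation can help make it concrete. The Python sketch below is a hypothetical illustration, not Wald’s actual analysis: the aircraft sections, hit counts and lethality probabilities are all invented for the example. It shows how counting bullet holes only on returning aircraft makes the fuselage look like the most-hit area, even when hits land uniformly.

```python
import random

# Hypothetical simulation of survivorship bias (Wald's insight).
# All section names and probabilities below are invented for illustration.

random.seed(42)

SECTIONS = ["engine", "cockpit", "fuselage", "wings"]
# Assumed chance that a single hit to each section downs the aircraft.
LETHALITY = {"engine": 0.6, "cockpit": 0.4, "fuselage": 0.05, "wings": 0.1}

def fly_mission():
    """Return (hits, survived): hits per section and whether the plane got home."""
    hits = {s: 0 for s in SECTIONS}
    survived = True
    for _ in range(random.randint(0, 8)):         # a random number of hits
        section = random.choice(SECTIONS)         # hits land uniformly
        hits[section] += 1
        if random.random() < LETHALITY[section]:  # each hit may be fatal
            survived = False
    return hits, survived

observed = {s: 0 for s in SECTIONS}  # holes counted on returning aircraft only
actual = {s: 0 for s in SECTIONS}    # holes across ALL aircraft (unknowable in WWII)

for _ in range(100_000):
    hits, survived = fly_mission()
    for s in SECTIONS:
        actual[s] += hits[s]
        if survived:
            observed[s] += hits[s]

for s in SECTIONS:
    print(f"{s:8s} observed on returners: {observed[s]:6d}   all aircraft: {actual[s]:6d}")
```

In the simulated totals, all sections are hit about equally often, but among the returners the fuselage dominates and engine holes are heavily under-represented: the aircraft that would have shown them never came home. That gap between the two columns is exactly the missing data Wald was pointing at.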

Ultimately, improbable as it sounds, data science and leadership go hand in hand. Leaders need to make decisions, and decisions need good data and the proper approach. I have recently developed a new leadership model called Start With Another Narrative (SWAN). The philosophy is to question what you see and to challenge the easy explanation. The principle is fairly simple, but the execution can be very complicated: Abraham Wald’s explanation of the bullet holes is straightforward, but it took a mathematical genius to see it. SWAN can incorporate any of the 50 or so Business Analysis tools, or advanced statistics and simulation. But it can also simply mean developing one’s critical thinking using simple tools. For now, let’s focus on the three lessons from the bridge of a submarine:

  • Avoid becoming data-bound: prioritise which information (or which absence of information) matters most
  • Be aware of errors of commission and omission (see Matthew Syed’s “Black Box Thinking”)
  • Look out for missing or obscure data

BOOK A TALK ABOUT SWAN

Presentations, pre/post-dinner talks about leadership in the Information Age

