David Chappell, Chief Strategy Officer for TripStax, believes it’s time for a new approach to quality control because current systems aren’t up to the task
When I was six years old I watched my father attempt to wash coffee from a duvet. He was not a man much acquainted with laundry, and it was a bizarre sight to see this goliath, as he was to me then, shoving some five square metres of 22-tog bedding into our tiny Hotpoint washing machine. I recall him physically kicking it into place, black boot on white cotton, until the final corner was eventually ensconced in the bulging drum. Once the cycle had started, I sat there on the kitchen floor before the straining machine and watched, bemused, as it rattled and danced in its spot.
It is an enduring image from my childhood and one that for many years remained just that. And yet now, I am reminded of it almost every day as I look at how we, as an industry, gamely attempt to shoehorn processes into systems that simply weren’t designed to accommodate them. That is not to say that they are not functional, but they, like the 4kg drum of that embattled brown and beige machine of my father’s, are both limited and limiting.
“As the industry continues to disaggregate apace, it is folly to continue to kick the proverbial data duvet into place”
Quality Control (QC) is a prime example. It should come as no surprise to anyone remotely acquainted with agency operations that customers of TMCs expect a certain level of booking competence and data integrity from their partners. It is therefore, rightly, one of the biggest parts of the offering of many TMCs – that in every instance, on every booking, fares will be checked, rules applied, data collected and service levels met.
And yet to do this, worldwide, we still rely on systems that only process data residing entirely within the GDS. Which means that if an agent wishes to QC every booking made for a customer, even those made outside of the GDS must be shoehorned back into the plain text world of PNRs, segments and remark lines. This has to change.
As the industry continues to disaggregate apace, it is folly to continue to kick the proverbial data duvet into place. To attempt to translate every booking, regardless of where it is made, into this machine is to perpetuate a nonsense that only makes sense because the workflows are anachronistic.
Please do not misunderstand me: I am not anti-GDS. GDSs are immensely efficient distributors. But in a world where bookings are now created in a multitude of systems, to then rely on a system that can only process PNRs is no longer scalable.
QC processes and those associated with them, such as document production, need to be centralised outside of the GDS, on a more modern, API-first connected landscape. Booking data can then be received from any provider, in its original format, be processed, and then pushed to whichever system or workflow is required next, without significant translation or data loss. The possibilities for automation and subsequent efficiencies are myriad. Having a separate QC layer is the future.
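The pattern described above – receive a booking from any provider in its native format, run the checks, then hand a clean result to the next system – can be sketched in a few lines. This is purely illustrative: the names (`Booking`, `qc_checks`, the specific checks) are hypothetical, not any real TripStax or GDS API.

```python
# A minimal sketch of a GDS-independent QC layer. All names and checks
# here are hypothetical, chosen only to illustrate the pattern of
# "receive in native format, check, then push onward".
from dataclasses import dataclass, field

@dataclass
class Booking:
    source: str                      # e.g. "GDS", "NDC", "hotel-API"
    fare: float                      # fare actually booked
    fare_quoted: float               # fare agreed with the customer
    remarks: dict = field(default_factory=dict)

def qc_checks(booking: Booking) -> list[str]:
    """Run every configured check against the booking as received."""
    issues = []
    if booking.fare > booking.fare_quoted:
        issues.append("fare exceeds quoted amount")
    if "cost_centre" not in booking.remarks:
        issues.append("missing cost centre")
    return issues

def process(booking: Booking) -> dict:
    """QC a booking from any provider and return a payload ready to
    push to whatever system comes next (documents, mid-office, etc.)."""
    issues = qc_checks(booking)
    return {"source": booking.source, "passed": not issues, "issues": issues}
```

The point of the sketch is that `process` never sees a PNR: a booking from a hotel API is checked by exactly the same code path as one from a GDS, and new suppliers only need a small adapter into the `Booking` shape.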
“In a world where bookings are now created in a multitude of systems, to then rely on a system that can only process PNRs is no longer scalable”
In doing this, agencies free themselves from the chains of having to put everything into a PNR, the so-called ‘GDS centricity’. This in turn allows them to be more creative about the kinds of processes that can be put in place for any given customer. It also affords them commercial agility. Bringing in or changing a supplier need not be a challenge when the primary forms of check and balance are already defined, waiting simply for their connection.
When my father returned and dragged the sodden duvet out of the wheezing machine, it is fair to say that the coffee had not so much been cleaned as distributed about the bedding. For all the effort involved, the machine simply wasn’t up to the task. He was disappointed with the output. I wonder whether there are agencies out there looking at their own systems and thinking the same.