The other day I came across an interesting blog post on the future of data exchange. Written by Scott Sangster, VP of Global Logistics at Descartes, it looks at what information will need to be exchanged in the future. A key question he highlights is who should be driving that future: should IATA and Cargo 2000 be dictating the terms of reference and leading the way on messaging, or should it be the tech companies?
For Sangster it is not an either/or situation. He sees IATA and Cargo 2000 playing crucial roles in establishing procedures and messaging formats and in helping set the standard for the industry, while tech companies have a significant amount of knowledge that can help set a “realistic course for standards development and implementation” for air cargo and other industries. By way of example he notes: “There has been some additional involvement from technology companies as the XML standards were developed and in my opinion, this needs to expand and continue.”
When it comes to the development of new standards, programmes and processes, he highlights that as these are created “it’s the job of all parties involved to ensure that they are accessible and affordable for the entire industry in order to encourage adoption.”
Indeed, this could well be read as a cautionary tale if one looks at the e-freight example and reads between the lines. In the much-troubled e-freight saga, one big criticism is that the IT providers were not fundamentally involved from the very beginning. That complaint is mostly muttered on the sidelines of events like the World Cargo Summit, by voices that have otherwise remained relatively mute, perhaps not wanting to bite the hand that will eventually feed them. To be fair, the process was radically overhauled and made far more inclusive once the environment at IATA changed a few years ago and greater industry cohesiveness was forged, for instance in the form of GACAG. The problem was that this came very late in the game. In his blog Sangster notes that e-freight’s principles, the potential of its data collection and the flexibility of disseminating that data using new technologies hold great promise for the industry. But there is a qualification: “As long as the programme remains flexible and inclusive of all participants from a technology and cost perspective it lays the foundation for the collaborative data management world that is on our doorstep.”
As the new Cargo-XML schema becomes widespread, says Sangster, the flexibility to provide enhanced data and additional information will allow for easier and more affordable ways of implementing data exchanges to meet the market’s requirements. These requirements are likely to be far more sophisticated and complicated than today’s, driven not just by commercial needs (themselves partly a product of the growing range of options technology will create) but also by regulatory needs arising from expanding cargo security requirements.
More information, from more parties, further back in the supply chain and earlier in the process: these are likely to be the hallmarks of future data needs when it comes to security. And this calls for an “industry-wide obligation to have more collaborative multi-party business processes and the technology that supports it across air freight carriers and anyone participating in the air cargo ecosystem,” Sangster says.
It may seem premature to speak of a post-e-freight environment, especially when only six per cent of all global shipments use what has become a latent first step in the e-freight process – the e-AWB – against IATA goals of 20 per cent by the end of this year and 50 per cent next year. But it is precisely because of this that now is the right time to remind the entire industry that the next leap in the digital realm needs a more inclusive process from the very beginning.