Zuzanna Kosowska-Stamirowska writes about how data acquisition and algorithm design need to be defined - or redefined - for the existing market players to make the AI revolution work.

The logistics industry is on the brink of an artificial intelligence (AI) revolution. This sounds like a bold statement for an industry which is currently taking its first steps into digitisation, with close to half of air freight still processed through paper waybills, shipping agents still communicating with carriers by phone and email, data entered into systems by operators hours or days late, and serious operation planning often still done in a spreadsheet. But the AI revolution is coming to logistics quickly, perhaps much sooner than expected - and if it does not come from within the industry, then it is likely to come from outside.

There are other similarly complex industries, such as the energy market, which also involve network-based operations with separated supply and demand, and which are already deep into the AI revolution. So we should rather ask: why is AI only reaching the logistics industry now? We can hazard the obvious answer: because, until now, this was a traditional industry, with operation planning tuned to the capacity of human operators, and there was simply not enough external pressure to drive an AI revolution from within. Until recently, the large players of the New Economy who might have had skin in the game - Amazon in “the West” and the Alibaba group in “the East” - relied on more established logistics actors to grow and scale their e-commerce and distribution operations. This resulted in a complicated provider-client relationship between the incumbents and the new players, with little will to enter into direct competition. The lack of AI methods suited to the challenges of logistics did not help to speed up the process either.

In the B2C realm, a new standard of service and expectations has been irrevocably set by the likes of Amazon and Uber: predictable delivery times, high service availability, online quotes, and improved visibility of the delivery process. All of these aspects are just as pertinent in supply chain planning as they are in consumer operations, and it is equally clear that the right answers come through a combination of demand-driven planning, e-platforms, and AI-powered predictive analytics.

The only question is: who will bring the new approach into the industry? Will the current air and sea carriers be the ones best equipped to use data to optimise and tune their operations? Will it be freight forwarders, taking advantage of their existing B2B client relationships to bring them into a new digital era? Or will it be Amazon - whose 230+ billion USD revenue (2018) already tops the combined revenue of all the major world sea freight companies - taking a shot at redefining the logistics industry its own way? This question is becoming ever more real. A few years back, we could muse futuristically about who would launch the first commercial drone-based delivery service: Amazon, Uber, or DB Schenker? Today, we may instead ask ourselves whether next year’s Christmas gift will be flying to us with FedEx from Memphis, with Amazon Air from their hub in Kentucky, or perhaps with "Alibaba Air" from their new hub in Liege. The new players in the field have a clear advantage in terms of their potential for supply chain consolidation and data-driven operations planning. But the market is not theirs yet - and the next move in the game belongs to the established actors on the logistics playing field.

For any supply chain actor launching into the AI revolution, the key building blocks to take into account are data acquisition and algorithm design. Both of these elements need to be defined - or redefined - for the existing market players to make the AI revolution work. Every case calls for a different, tailor-made process. Nonetheless, there are some sure ingredients of the right AI approach which repeat as a leitmotif.

First, it always pays to “act globally”. This is a guideline which applies at all stages of an AI rollout: favour global data processing across the entire logistics network over pointwise operation planning. From a pure business angle, the costs of deploying an AI platform do not depend much on the scale of operations, while the benefits are incomparably higher at a larger scale. But there is also a more profound engineering explanation for such an approach. In AI methods, global forecasting of demand and global resource planning are simply much easier to achieve than their local counterparts. New forecasting methods succeed in predicting demand for a service (such as air freight) at a hub almost perfectly if given enough data to churn - from multiple hubs, not just the single hub concerned. Here, machines not only significantly exceed human performance; they also process far more data than a human operator could ever reasonably be expected to digest.
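
As a rough illustration of what “acting globally” means in practice - a minimal sketch with synthetic data and an off-the-shelf model, not any particular production system - the snippet below pools weekly demand histories from many hubs into a single forecasting model and compares it, on held-out weeks, with independent models trained per hub:

```python
# Illustrative sketch only: synthetic weekly demand for 20 hypothetical hubs,
# one "global" model trained on all hubs pooled together, and one "local"
# model per hub, compared on the same held-out weeks.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(42)
n_hubs, n_weeks, lag = 20, 104, 4

# Shared yearly seasonality, hub-specific base volume, and noise.
season = 1.0 + 0.3 * np.sin(2 * np.pi * np.arange(n_weeks) / 52)
base = rng.uniform(50, 500, size=n_hubs)
demand = base[:, None] * season[None, :] + rng.normal(0, 25, (n_hubs, n_weeks))

def features(hub, t):
    """Hub one-hot + week of year + the last `lag` weeks of demand."""
    return np.concatenate([np.eye(n_hubs)[hub], [t % 52], demand[hub, t - lag:t]])

rows = [(hub, t) for hub in range(n_hubs) for t in range(lag, n_weeks)]
X = np.array([features(h, t) for h, t in rows])
y = np.array([demand[h, t] for h, t in rows])
t_idx = np.array([t for _, t in rows])
hub_idx = np.array([h for h, _ in rows])
train, test = t_idx < 80, t_idx >= 80          # first ~1.5 years train, rest test

# Global model: a single model fed with data from every hub.
global_model = GradientBoostingRegressor().fit(X[train], y[train])
global_mae = mean_absolute_error(y[test], global_model.predict(X[test]))

# Local models: an independent model per hub, each seeing only its own history.
local_preds = np.empty(test.sum())
offset = 0
for hub in range(n_hubs):                      # rows are ordered by hub, then week
    tr, te = train & (hub_idx == hub), test & (hub_idx == hub)
    model = GradientBoostingRegressor().fit(X[tr], y[tr])
    local_preds[offset:offset + te.sum()] = model.predict(X[te])
    offset += te.sum()
local_mae = mean_absolute_error(y[test], local_preds)

print(f"global model MAE:   {global_mae:.1f}")
print(f"per-hub models MAE: {local_mae:.1f}")
```

In the global model the hub identity simply becomes one more input feature, so hubs with short or noisy histories can borrow statistical strength from the rest of the network.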

The second ingredient is to always “rely on your own data”, while working with other actors. This is not to say that current standardisation efforts or industry cooperation on data collection and sharing are not crucial for the sector. Nevertheless, collaborative efforts may come too late, and the stakes in the game are too high to be lost to inertia in the sector.

In terms of data acquisition, it seems likewise prudent not to rely too much on local data integration efforts, which simply take too much manpower to put into place. It is advisable to squeeze as much value as possible from the data which is already available and to move quickly on the market. A useful benchmark is the way Google Maps addresses trip planning, where all the crucial traffic data is transmitted automatically by the GPS sensors of users’ mobile devices. For freight logistics as well, predictive intelligence based on IoT sensor data may be the path to follow for data acquisition. And, just as with Google Maps, sensor data acquired in such a solution benefits all clients and users of the service, providing them with accurate forecasts of demand, congestion, and delays. Not every shipment needs to carry a sensor: there exists an optimal level of IoT coverage of shipments, sufficient to ensure supply chain predictability. Guiding IoT investment, along with subsequent steps such as IoT data segmentation and analysis, comes as a natural part of the move towards intelligent logistics platforms.
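
A back-of-the-envelope simulation - with entirely synthetic numbers and a single hypothetical lane, not any real network - shows why full coverage is unnecessary: the precision of a lane-level delay estimate improves roughly with the square root of the number of tracked shipments, so beyond a certain coverage each additional sensor adds little.

```python
# Illustrative only: how the stability of a lane-level mean-delay estimate
# changes with the share of shipments carrying an IoT tracker.
import numpy as np

rng = np.random.default_rng(7)
n_shipments = 5000                                           # shipments on one hypothetical lane
delays = rng.gamma(shape=2.0, scale=6.0, size=n_shipments)   # synthetic true delays, in hours

for coverage in (0.01, 0.05, 0.10, 0.25, 0.50, 1.00):
    n_tracked = int(coverage * n_shipments)
    # Re-draw the tracked subset many times to see how much the estimate wobbles.
    estimates = [
        rng.choice(delays, size=n_tracked, replace=False).mean()
        for _ in range(200)
    ]
    print(f"coverage {coverage:>4.0%}: mean-delay estimate varies by ±{np.std(estimates):.2f} h")
```

In practice the right coverage level also depends on how shipments are segmented (by trade lane, commodity, or client), which is exactly where IoT data segmentation and analysis come in.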

All in all, it is up to every supply chain actor to make intelligent decisions in the face of uncertainty, to ensure robust, on-time operations, and to minimise the effects of disruptions - a key point in ensuring a resilient supply chain for all.

We created NavAlgo with the mission of helping key players in the logistics industry to embark swiftly on the AI revolution. Our objective is to help our clients adapt their service offer and their data-driven operations planning to the needs of the logistics market of the future. We combine the expertise and viewpoints of active researchers in AI and forecasting, recruited from organisations such as Google Brain, of industry practitioners in real-world logistics operations, and of experts in business process and pricing design. We take a holistic approach to problems in logistics, linking forecasting with dynamic optimisation under uncertainty. In this way, we put forward a service which provides the most value to the end client, with operations adapting to meet her expectations in an efficient way. This means meeting demand where it appears and providing enhanced supply chain visibility and predictability.
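
To give a flavour of what linking forecasting with optimisation under uncertainty means - a toy, newsvendor-style example with hypothetical costs and a made-up predictive distribution, not our production models - consider booking capacity on a lane directly from a probabilistic demand forecast:

```python
# Toy example: turn a probabilistic demand forecast into a capacity booking
# by balancing the cost of unused capacity against the cost of unmet demand.
import numpy as np

rng = np.random.default_rng(1)

# Pretend a forecasting model returned next week's demand on a lane as
# Monte Carlo samples (in TEU).
demand_samples = rng.normal(loc=120.0, scale=18.0, size=10_000).clip(min=0)

cost_unused = 300.0     # cost per TEU booked but not used
cost_shortfall = 900.0  # cost per TEU of demand that cannot be served

# Classic critical-fractile rule: book the q-quantile of forecast demand,
# where q = shortfall cost / (shortfall cost + unused-capacity cost).
q = cost_shortfall / (cost_shortfall + cost_unused)
capacity = np.quantile(demand_samples, q)

expected_cost = np.mean(
    cost_unused * np.maximum(capacity - demand_samples, 0)
    + cost_shortfall * np.maximum(demand_samples - capacity, 0)
)
print(f"critical fractile q = {q:.2f}, book {capacity:.0f} TEU, "
      f"expected mismatch cost ≈ {expected_cost:,.0f}")
```

The sharper the forecast, the tighter this trade-off becomes, which is why forecasting quality and operations optimisation cannot be treated separately.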


The writer is the CEO of NavAlgo, an information technology company that builds integrated solutions for intelligent resource allocation in logistics and creates cutting-edge algorithms to navigate a sea of data.

This story was originally published in Logistics Update Africa's September - October 2019 issue.