With snowstorms hitting the East Coast hard this year, many travelers are stranded in cities far from home when their connecting flights are cancelled. In recent years, airlines have changed the way they rebook passengers on alternate flights.
Here's what happens today: passengers get a notification from the airline, perhaps before or while deplaning, that their connecting flight was cancelled due to weather. Don't worry, the notice proclaims, the airline has proactively taken care of their rebooking needs. What follows is their revised itinerary.
The process used to be different: stranded passengers formed a mob around the gate agents who patiently worked through the options to identify their most desired alternate flights. Seasoned travelers scrambled to reach the gate agents as quickly as possible because the best options (e.g. direct flights) would be snapped up in a second.
***
The new method is more civilized. It prevents the mob scene, and the protracted negotiations with the agents to find the least inconvenient option.
The new method shifts the task of rebooking from the agents to a software program running an algorithm. This solution has major advantages from an engineering perspective. All passengers are treated as pawns that can be moved around a chessboard. The software has a bird's-eye view of everybody's destinations, the future schedule of flights, the number and types of seats available on each flight, and so on.
The algorithm finds the best rebooking assignments at the system level, instead of the individual level. In the old world of first-come, first-served, agents made "greedy" moves, satisfying the current passenger, possibly at the expense of others still in queue. For example, after letting the current passenger onto a specific flight, an agent might later have to split up a family.
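The contrast between greedy, first-come-first-served rebooking and a system-level assignment can be sketched in a few lines. This is a toy illustration, not any airline's actual algorithm; the flights, seat counts, and passenger names are invented for the example.

```python
# Two hypothetical alternate flights, each with limited free seats.
seats = {"AA101": 2, "AA202": 3}

# Passengers in queue order; members of a family share a group id.
passengers = [
    ("solo_1", "g1"),
    ("parent", "g2"),
    ("child", "g2"),
    ("solo_2", "g3"),
    ("solo_3", "g4"),
]

def greedy(passengers, seats):
    """First-come, first-served: seat each passenger on the
    first flight that still has a free seat."""
    seats = dict(seats)           # don't mutate the caller's copy
    assignment = {}
    for name, _group in passengers:
        for flight, free in seats.items():
            if free > 0:
                seats[flight] -= 1
                assignment[name] = flight
                break
    return assignment

def system_level(passengers, seats):
    """System-level view: seat whole groups together,
    placing the largest groups first."""
    seats = dict(seats)
    assignment = {}
    groups = {}
    for name, group in passengers:
        groups.setdefault(group, []).append(name)
    for _gid, members in sorted(groups.items(), key=lambda kv: -len(kv[1])):
        for flight, free in seats.items():
            if free >= len(members):
                seats[flight] -= len(members)
                for name in members:
                    assignment[name] = flight
                break
    return assignment

g = greedy(passengers, seats)        # splits the family across flights
s = system_level(passengers, seats)  # keeps the family on one flight
```

With these numbers, the greedy pass gives the parent the last seat on the first flight and pushes the child onto the second, while the system-level pass sees the family as a unit and seats it together.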
In the new method, passengers are not given a chance to fight for better options. They are coerced to accept the outcomes of the algorithm. The software optimizes for the “average customer.”
***
So is the new method better? It's not as clear-cut as the engineers would like us to think.
Accountability
There is one myth that both engineers and consumers share: the reification of software. Software (read: AI) is treated as a person with the ability to think for itself. In reality, decision-making did not pass from human agents to this AI "person"; the AI is an algorithm designed by software engineers to behave in a certain way. So the algorithm designers have effectively taken over the job of rebooking passengers.
But if you ask the algorithm developers, they would deny having accepted that role. They live far from the scene of the chaos. At the airport, passengers cannot complain to these engineers; they would be lucky to find a gate agent to whom they could plead their case.
The new method is opaque. It feels like coercion. A friend recently told me that the notification even disclosed that passengers might end up in a fare class lower than the one they paid for – with no possibility of compensation. One problem with opaque systems is that the feedback mechanism crucial for ongoing improvement is curtailed.
Whose interests
Software does not necessarily make better decisions than human agents. A real concern is what is being optimized. There are many possible "objective functions" for this problem, such as:
- maximizing profit for the airlines
- minimizing inconvenience for passengers, e.g. by minimizing average delay to final destination
- avoiding fines, or compensation for inconvenience caused
- satisfying a segment of the passengers that the airlines deem more important, such as those who pay more for the ticket, veterans, etc.
Further, an algorithm can be tuned to work well on average, to create the most good for the most passengers, or to satisfy any number of other criteria.
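The point that the choice of objective function decides the outcome can be made concrete with a small sketch. The itineraries and numbers below are invented; the idea is simply that the same candidate rebookings produce different "best" answers under different objectives.

```python
# Candidate rebookings for one hypothetical stranded passenger.
options = [
    {"name": "direct_tomorrow", "delay_hours": 18, "cost_to_airline": 400},
    {"name": "two_hop_tonight", "delay_hours": 6,  "cost_to_airline": 900},
]

def best(options, objective):
    """Return the option that minimizes the given objective function."""
    return min(options, key=objective)

# Objective 1: minimize the passenger's delay to the final destination.
passenger_first = best(options, lambda o: o["delay_hours"])

# Objective 2: minimize the airline's cost (rebooking fees, vouchers, ...).
airline_first = best(options, lambda o: o["cost_to_airline"])
```

Under the passenger-delay objective the two-hop flight tonight wins; under the airline-cost objective the direct flight tomorrow wins. Nothing about the flights changed, only the function being minimized.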
The software explicitly or implicitly settles all these issues. But the opaqueness discussed above ensures that most if not all personnel involved have only a weak understanding of whose interests are being served. The airlines loudly proclaim that they are doing what's best for passengers. But it is naïve to think that what's best for passengers always aligns with what's best for the airline's business results. More often than not, those two objectives are in conflict.
Take one example: the airline wants to avoid paying passengers any kind of compensation for the inconvenience – and actual cost – of an unplanned stay at a layover city. Passengers prefer that airlines cover those unplanned costs. In the past, passengers could plead with the gate agents. That avenue disappears when passengers are sent one-way instructions from a remote computer.
Special Needs
A system-level optimization essentially serves the "average user" well; anyone with special needs will suffer. For example, if you are rushing to a funeral, the software will not know it, and even if it did, the engineer may not have anticipated the needs of funeral-goers. In the old world, the human agent might have worked out a solution, perhaps with the assistance of other passengers.
Fairness
The system is fair only to the extent that one understands and agrees to the trade-offs implicitly coded into the algorithm. Since those design decisions are almost never made public, customers have no way of knowing if they are fairly treated. This creates an environment in which unfair treatment can be covered up.
***
From a mathematical perspective, the new way of rebooking passengers through a behind-the-scenes algorithm is more orderly and optimal at the system level. However, this automated approach treats passengers as cattle, and it can cause real pain because some passengers are bound to have individual needs. The software approach also lets businesses hide behind opaque processes, and it could be exploited by those demanding special privileges.