AI in Transportation
a zippy ride full of legal landmines
I’m in line at the departure terminal of an airport, of the commercial (not business jet) kind.
A large family with young kids in tow insists loudly on getting human assistance instead of checking in their tickets and baggage at an electronic terminal. The year is 2023 and we are at Changi Airport in Singapore, where automation has slowly crept into airport operations. Automation has taken hold, but some consumers continue to demand human attention.
This is merely a foretaste of how AI will rip through the transportation sector. Let’s examine where the legal landmines may lie.
Nothing in this article should be construed as legal advice.
Crew Scheduling
The 2022 Southwest Airlines crew scheduling crisis saw nearly 17,000 flights cancelled during one of the busiest holiday peak travel seasons. This led to about US$600 million in reimbursements to passengers, plus a US$140 million fine by the US Department of Transportation announced in December 2023.
Some context. This was the first major end-of-year holiday season after the pandemic years. It was a time to see family and gather in close physical proximity, just like old times, some for the first time in two or three years. Many airlines, having had slim years during the pandemic, were running at a loss. To increase profits, airlines needed quick turnaround times: land the aircraft, disembark the passengers, and prepare the aircraft for the next flight as quickly as possible. As AI company “Rosterize” (https://www.rosterize.aero) puts it, “fewer crews, more flights”.
As the aviation industry continues to struggle with labour shortages, AI could be relied on to do more with fewer employees, by deploying employees across tasks more efficiently.
Key legal issue:
- Potential bias, leading to employment-related claims. A lack of algorithmic transparency in AI solutions can make it difficult for employers to cure such bias.
Solution:
- Develop policy guidelines to ensure that any potential bias is mitigated. The IEEE P7003 Standard for Algorithmic Bias Considerations is one IEEE ethics-related standard that can be referenced. IEEE P7003 was last approved in September 2023 and will be revised in December 2024. One interesting development in the September 2023 update is that it no longer controls only for negative bias; it now controls for both negative and positive bias. Reviewing AI solutions against such standards requires an assessment of whether the standards themselves align with the employment laws of the jurisdiction in which the employees are employed.
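To make the bias-mitigation point concrete, here is a minimal sketch of one screen an employer might run on an AI scheduler’s output: the “four-fifths rule” comparison of favourable-assignment rates across employee groups. The roster data, group labels and threshold below are invented for illustration; the 80% figure is a US EEOC guideline and may not map onto the employment laws of other jurisdictions.

```python
# Hypothetical sketch: a four-fifths (80%) rule check on crew roster
# assignments. All data and group labels are illustrative, not real.

def selection_rates(assignments):
    """assignments: list of (group, got_desirable_shift: bool) pairs."""
    totals, favourable = {}, {}
    for group, selected in assignments:
        totals[group] = totals.get(group, 0) + 1
        favourable[group] = favourable.get(group, 0) + int(selected)
    return {g: favourable[g] / totals[g] for g in totals}

def disparate_impact_ratio(assignments):
    """Ratio of the lowest group selection rate to the highest."""
    rates = selection_rates(assignments)
    return min(rates.values()) / max(rates.values())

# Toy output from a hypothetical AI scheduler
roster = (
    [("group_a", True)] * 40 + [("group_a", False)] * 10 +
    [("group_b", True)] * 25 + [("group_b", False)] * 25
)
ratio = disparate_impact_ratio(roster)
print(f"disparate impact ratio: {ratio:.2f}")  # a ratio below 0.8 warrants review
```

A check like this is only a first-pass screen; it says nothing about why the disparity arose, which is where the algorithmic-transparency problem discussed above bites.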
AI pricing experiments
Dec 2023 - Delta Air Lines is using AI to help its human analysts determine the amount passengers are willing to pay for its premium services. Delta Air Lines CEO Ed Bastian was quoted by Aviation Week as saying: “We have over $40 billion of assets that we’ve invested in [our capital base]… If digital and AI just can increase the value of that just by 2, 3, 4%, you’re looking at billions of dollars of opportunity over time.”
Key legal issue:
Data protection. For pricing analysis to be carried out, large amounts of financial and personal data will have to be collected or accessed.
Solution:
Enhanced data security and cyber security measures. However, ensuring the security of such data will now also be an added cost to the business.
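One data-minimisation measure that sits alongside security controls is pseudonymising identifiers before the data reaches the pricing-analytics pipeline, so analysts work on tokens rather than raw personal data. The sketch below is a minimal illustration; the field names and salt handling are assumptions, and a real deployment would need proper key management and a lawful basis for the processing.

```python
# Hypothetical sketch: pseudonymising passenger identifiers before
# pricing analysis. Field names and the salt are placeholders.

import hmac
import hashlib

SECRET_SALT = b"rotate-me-and-store-in-a-vault"  # placeholder secret

def pseudonymise(passenger_id: str) -> str:
    """Deterministic keyed hash: the same ID always maps to the same
    token, but the raw ID cannot be recovered without the salt."""
    digest = hmac.new(SECRET_SALT, passenger_id.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

record = {"passenger_id": "P-12345", "fare_paid": 842.50, "cabin": "premium"}
safe_record = {**record, "passenger_id": pseudonymise(record["passenger_id"])}
print(safe_record)  # fare and cabin survive; the identifier is now a token
```

Note that pseudonymised data can still be personal data under regimes such as the GDPR, so this reduces risk rather than eliminating the compliance obligation.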
Satellite data for Maritime Industry Risk Detection
Satellite data has for many years been used to detect pirates, smugglers, terrorists, and adverse climate and weather events that affect the operation of seagoing vessels. (See here for a quick video introduction by Windward.) The challenge now is not so much getting access to such data as interpreting it. AI is now a powerful tool for producing predictions and analytics for the industry.
Key Legal Issue:
It can be practically impossible to attribute liability for false or even grossly negligent recommendations, due to the lack of algorithmic transparency (predictive analytics firms need to keep their algorithms confidential to preserve their value). This creates a catch-22: there appears to be no easy way to seek recourse against grossly negligent predictions.
KYC and anti-terrorism breaches. An easy way for smugglers and terrorists to get ahead would be to access the satellite data and predictive analytics themselves. Unfortunately, the companies selling such analytics software are often not in highly regulated industries, unlike banks and law firms, so KYC and anti-money laundering checks are often not in place.
Solution re liability attribution:
Consider reversing the burden of proof once the technology and the players in this field become more sophisticated, or reverse the burden of proof now but cap the claim limits.
Solution re KYC and anti-terrorism breaches:
Businesses handling satellite data relating to defence and anti-terrorism should be required to implement KYC, anti-terrorism and AML procedures.
Conclusion
AI is definitely coming for your job. I’ve merely covered a few instances of recent relevance here. Sorry, I know you were looking for some kind of conclusion relevant to the analysis above. But if I wrote something that made complete sense, you’d think I relied on some kind of large language model to conjure it up.
Prior to the Southwest holiday incident in 2022, despite more than 10 years of representing key players in the aviation sector, including airlines, I never knew there were professionals dedicated to crew scheduling!
That’s really all I can think of. So go out there and own your own business, or stay in the business of regulating AI and the rest of us.
I guess that’s what I would be afraid of if I had to write the article under the guise that it was made with 100% human effort. Oh whoops. Was I supposed to say that?
Nothing in this article should be construed as legal advice.

