Their codebase has to be a wild ride. There are still traces from the days it ran as Windows binaries in CGI fashion (I very much doubt they still do it like that, just as a large number of eBay URLs still contain the ebayISAPI.dll bit from the time the site was literally a Microsoft IIS plugin). Depending on the exact parameters you hand it, the same DB instance also serves something like five different generations of UIs. You can also find several different iterations of APIs: one search API outputs actual, non-trivial JS code to fill auto-completes, another gives you JSON in a totally different format, and yet another returns XML with everything slightly different again. Judging from their job postings, their backend is C++.
The previous system came from the 80s and ran on multiple Tandem/Nonstop clusters.
As far as I am aware, the system, capable as it is, started as someone's PhD thesis in the mid-80s and has been developed continuously since then.
I worked on the data preparation for a small HAFAS installation at one point, and the specification was an absolute mess of fixed-width text, delimited text, and XML, sometimes all in the same file!
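To give a flavour of what handling such mixed-format files looks like, here is a minimal sketch of a line-by-line dispatcher. The record layouts (a 7-character stop ID followed by a name, semicolon-delimited rows, embedded XML fragments) are invented for illustration and are NOT the real HAFAS specification:

```python
# Hypothetical mixed-format parser; field widths and delimiters are
# made up for illustration, not taken from the actual HAFAS spec.
def parse_line(line: str):
    line = line.rstrip("\n")
    if line.startswith("<"):
        # An XML fragment embedded directly in the data file.
        return ("xml", line)
    if ";" in line:
        # A semicolon-delimited record.
        return ("delimited", line.split(";"))
    # Otherwise assume fixed-width: 7-char stop ID, then the stop name.
    return ("fixed", (line[:7].strip(), line[7:].strip()))

sample = [
    "8000105Frankfurt(Main)Hbf",                 # fixed-width
    "8000105;Frankfurt(Main)Hbf;50.107;8.663",   # delimited
    '<stop id="8000105"/>',                      # XML fragment
]
for record in sample:
    print(parse_line(record))
```

In practice you would want real schema validation per record type, but the dispatch-on-shape pattern above captures why mixing three formats in one file is so unpleasant: the parser has to guess the format before it can even read a field.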
https://www.hacon.de/unternehmen/