September 22, 2004 at 10:37 pm
This may be a dumb question, so apologies in advance.
Why do airports and airlines have 2/3/4 letter codes in the 21st century? Would it not be easier to use longer, more intuitive codes?
I mean, most of us know MME = Teesside or GCI = Guernsey. But it is very unintuitive, and wouldn’t a code like “TEESDE” or “GRNSY” be better, if the whole name couldn’t be used?
By: MontyP - 26th September 2004 at 20:34
For the World Tracer baggage system we use the airline’s 2 letter code, e.g. U2 = easyJet, UX = Air Europa. If you were to enter the flight number as Ezy… it would revert to U2 when you enter the file.
We also use two digit codes for the handling companies, e.g. 08 = Aviance, 04 = Servisair.
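A minimal sketch of that kind of designator normalisation, assuming a hypothetical lookup table (the real World Tracer mapping is far larger, and this is not its actual interface):

```python
# Hypothetical ICAO-to-IATA airline designator table; illustrative only,
# not World Tracer's real data or interface.
ICAO_TO_IATA = {
    "EZY": "U2",   # easyJet
    "AEA": "UX",   # Air Europa
    "BAW": "BA",   # British Airways
}

def normalise_designator(code):
    """Return the 2 letter IATA designator for an entered airline code.

    A 2 letter IATA code is returned unchanged; a 3 letter ICAO code is
    converted via the table above, mirroring how an entry of "Ezy"
    reverts to "U2" on file entry.
    """
    code = code.strip().upper()
    if len(code) == 2:
        return code
    return ICAO_TO_IATA.get(code, code)

print(normalise_designator("Ezy"))  # -> U2
```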
By: EGNM - 26th September 2004 at 12:55
All flight plans run on the 3 letter ICAO codes, or the aircraft registration; you can’t use the 2 letter IATA codes for this purpose.
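As a sketch of that rule, assuming the usual flight plan convention (the aircraft identification is the ICAO designator plus the flight number, or else the registration with the hyphen dropped; the flight number and registrations below are made-up examples):

```python
# Sketch of a flight plan aircraft identification, per the convention
# described above: ICAO designator + flight number, falling back to
# the aircraft registration (hyphen removed).
def flightplan_ident(icao_designator, flight_number, registration):
    if icao_designator and flight_number:
        return f"{icao_designator}{flight_number}"
    return registration.replace("-", "")

print(flightplan_ident("EZY", 254, "G-EZJA"))  # -> EZY254
print(flightplan_ident(None, None, "G-ABCD"))  # -> GABCD
```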
By: SHAMROCK321 - 25th September 2004 at 20:24
Yes, the dreaded TPMs, but it’s all in the books so we don’t have to LEARN them. Haven’t come across MCOs (Miscellaneous Charges Orders) yet.
TPM = Ticketed Point Mileage.
By: danairboy - 25th September 2004 at 12:19
Good luck then; it’s nearly 15 years since I sat mine. Do you still have to learn TPMs and MCOs? Anyway, all the best! I hope they prove more useful to you than they did to me. I have never once filled in a blank IATA ticket since those exams!
By: SHAMROCK321 - 25th September 2004 at 08:04
danairboy, I’ll be sitting that exam in March. BTW, most airlines are not IATA members. A lot are, but certainly not most.
By: danairboy - 24th September 2004 at 00:19
I had to learn some of the more obscure airport codes and airline designators for various IATA ticketing and fares exams I sat many years ago. There were various decoding and encoding questions. Some are guessable, some are not!
It’s a bit like French verbs: they’re boring to learn, but the only way is to commit them to memory.
By: Pablo - 23rd September 2004 at 23:11
So why do we have both 3 and 4 letter codes? And why do airlines have 2 or sometimes 3?
As far as I understand, 3 letter airport codes are IATA (International Air Transport Association) codes. These are primarily used by airlines, airports and travel agents for route planning purposes. Most airlines are IATA members. For this reason, some seaports and railway stations also have IATA codes.
4 letter airport codes are ICAO (International Civil Aviation Organization) codes. The ICAO is responsible for standardising and co-ordinating civil aviation, and almost all civilian flying, on a global scale, which includes ensuring national civil aviation authorities comply with its standards for marker beacons, VORs, ATC and accident investigation. In essence, the ICAO regulates bodies such as the CAA and FAA.
In terms of airline codes, 2 character codes (e.g. BA, AF, DP, U2, ZB) are IATA codes. These can be alphanumeric. ICAO airline codes are 3 letter alphabetical codes (BAW, AFR, FCA, EZY, MON). From a passenger perspective, airlines generally choose whichever code they think passengers will remember; however, most flight plans use the 3 letter ICAO codes.
For a list of codes, see here.
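To make the two systems concrete, here is a minimal lookup sketch using a handful of well-known IATA/ICAO pairs (a real system would load the full published tables rather than a hand-written dictionary):

```python
# A few well-known IATA (3 letter) to ICAO (4 letter) airport pairs.
# Illustrative only; production systems load the full IATA/ICAO tables.
IATA_TO_ICAO = {
    "LHR": "EGLL",  # London Heathrow
    "MME": "EGNV",  # Teesside
    "GCI": "EGJB",  # Guernsey
    "JFK": "KJFK",  # New York Kennedy: contiguous US is K + IATA code
    "CDG": "LFPG",  # Paris Charles de Gaulle: elsewhere they rarely match
}

def to_icao(iata):
    """Look up the 4 letter ICAO code for a 3 letter IATA airport code."""
    return IATA_TO_ICAO[iata.upper()]

print(to_icao("lhr"))  # -> EGLL
```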
By: mongu - 23rd September 2004 at 22:59
So why do we have both 3 and 4 letter codes? And why do airlines have 2 or sometimes 3?
By: SHAMROCK321 - 23rd September 2004 at 19:18
I’m studying travel and tourism at the moment, and part of it is the Galileo computerised reservation system. I know a lot of the IATA 3 letter codes, but a lot of people don’t, so having to learn all those 3 letter codes is bad enough, never mind having to learn 4 letter codes.
By: Pablo - 23rd September 2004 at 10:00
As tenthije says, 3 letter IATA codes are used for connecting modes of transport, such as seaports and railway stations. Changing them would affect almost every online travel agent’s booking system and logistics system. ICAO codes make more sense to those who understand the coding system, but in many countries they often bear little resemblance to the name of the airport, VOR, beacon, etc…
By: tenthije - 22nd September 2004 at 23:05
I would personally kill anyone that changed or proposed to change the airport codes. The thought of having to change every airport code in the computer system of the transport company I work for is enough to give me the shivers. Really, you should not mess with something that is working really well.
For passengers it may not always be logical. Well, so what? How many airports display their flights just by IATA/ICAO code? How many tickets are printed with just the IATA/ICAO code? None: the city and/or airport name are always listed as well. It’s only us geeks who are interested in IATA/ICAO codes.
Us geeks may also like to know that, to some extent, the 3 letter codes are also used for seaports. At least they are within the company I work for, so I am assuming this is done worldwide. If I want to ship a container to the port of Houston I enter HOU, Montreal is YUL, etc.
By: Whiskey Delta - 22nd September 2004 at 22:48
Remember, those codes are for the use of the industry, not for the convenience of the public: 2 letter codes for marker beacons, 3 letter codes for VORs, 4 letter codes for airports (convention drops the first letter most of the time) and 5 letter codes for navigation fixes. A small change like adding 2 or 3 letters to an airport ID would throw off every navigation computer system out there, on the ground and in the airplane. So rather than change every computer in existence, I think the 3 or 4 letter codes will stay.
For those in the know, every airport ID is intuitive to a certain extent as long as you know the history behind it.
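As a rough sketch of that breakdown, a system might first classify an identifier by its length alone (a simplification, and the sample idents below are just shape examples; real navigation databases also cross-check idents against regional tables):

```python
# Rough classification of a navigation identifier by length alone,
# following the convention described above. A simplification: real
# avionics databases also cross-check idents against regional tables.
def classify_ident(ident):
    kinds = {
        2: "marker beacon",
        3: "VOR",
        4: "airport (ICAO)",
        5: "navigation fix",
    }
    return kinds.get(len(ident.strip()), "unknown")

for ident in ("OX", "POL", "EGNM", "WALLY"):
    print(ident, "->", classify_ident(ident))
```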
By: MINIDOH - 22nd September 2004 at 22:45
It’s easier, in my opinion, just to use the ICAO code with 4 letters.
There aren’t many people who should know and don’t know that EGLL is Heathrow, and there aren’t many people who should know and don’t know that KJFK is Kennedy International. But LHR is pretty simple to me, and OK, Teesside’s might not be, but think of how many Teessides there are in the world, and how many Manchesters…
I’m not saying I disagree, but they must be there for a reason.
By: MANAIRPORTMAD - 22nd September 2004 at 22:40
I’ve wondered that as well. Maybe it would just be too much fuss to change them; I don’t know really, but I’d be interested to find out!