Recently, New Jersey Governor Phil Murphy was trying to explain why the state's unemployment insurance systems were overwhelmed, and chose to blame a shortage of COBOL programmers. Since then, I have seen a lot of press about this shortage of COBOL developers, including an article with the alarming headline "An old programming language is threatening global stability". In discussing why there is such a shortage, the blame often falls on young developers, who supposedly just don't want to work in an old language and have been lured away by the shiny objects of newer languages.
I have worked as a software developer and consultant since 1981, and have worked at a lot of mainframe sites, and I don't feel that this is a realistic representation of the situation.
Here's a graph showing NJ unemployment claims from 2008 to 2020: New Jersey UI claims. You can see that the normal load was fewer than 20,000 claims per week, which the system was perfectly capable of processing, but in early April 2020 there were almost 215,000 claims in a single week.
The 'system' here, remember, includes the phone and in-person support to explain how to make a claim, and the paper and online forms to file a claim, as well as the mainframe application to process the claims. I know nothing about this specific application, but I'm going to guess that the critical piece involves an overnight batch process. The key constraint with most nightly batch jobs is that you have to shut down online access while the batch jobs run. So if your batch jobs have to process 10 times the number of records that they usually do, they might run 10 times longer, and might not be finished by the time you want to bring up the online system. If you cannot bring up the online system, you cannot start to process the next day's claims, and you get a lot of people shouting. A factor of 10 does not sound so bad, but it can turn a two-hour job into a 20-hour one.
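The batch-window arithmetic above can be sketched in a few lines. This is a rough back-of-the-envelope model assuming throughput scales linearly with record count; the claim counts come from the graph above, but the processing rate and window length are purely illustrative assumptions, not figures from New Jersey's actual system:

```python
# Rough model of a nightly batch window. All rates and window lengths
# are illustrative assumptions, not figures from NJ's real system.

def batch_runtime_hours(records, records_per_hour):
    """Estimate batch runtime, assuming roughly linear throughput."""
    return records / records_per_hour

NORMAL_CLAIMS = 20_000    # typical weekly claims (from the graph)
SURGE_CLAIMS = 215_000    # early-April 2020 peak (from the graph)
RATE = 10_000             # records processed per hour (assumed)
WINDOW = 10               # hours of overnight batch window (assumed)

normal = batch_runtime_hours(NORMAL_CLAIMS, RATE)
surge = batch_runtime_hours(SURGE_CLAIMS, RATE)

print(f"Normal run: {normal:.1f} h, surge run: {surge:.1f} h")
# The surge run blows well past the fixed overnight window,
# which delays bringing the online system back up the next day.
print(f"Surge overruns the window: {surge > WINDOW}")
```

The point of the sketch is that the window is fixed but the workload is not: a roughly 10x surge in claims pushes a comfortable two-hour run far past any plausible overnight window.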
Now if that long-running batch application is written in COBOL, how could having more programmers help? The job runs at the speed of the machine, not the size of the team: once the code is written, extra developers cannot make it finish sooner, short of a risky rewrite. So basically, no, having more programmers would not make the application run faster.
The way to make a batch application run faster is generally to throw hardware at it. Upgrade the disks, upgrade the CPU, etc. But upgrading hardware is an expensive ongoing cost, and if you upgrade the hardware so that you can handle 300,000 claims per day, but the claims drop back to 10,000 per day, then you have a large, unnecessary cost. Wouldn't it be nice to have the system running in the cloud, where you could just spin up extra power when you need it? I'm sure the salespeople are lining up (6 ft apart) to make a pitch to state IT departments already.
The trouble is that migrating business-critical mainframe applications to *nix servers is easier said than done (see my post about moving COBOL to the cloud), and many migration projects failed or stalled. Upper management don't want to invest in the mainframe, but they cannot get rid of it, so they keep the budget as low as possible, and surround the mainframe with other systems to add functionality like a web interface, ad-hoc reporting, etc.
Since the 1990s, there has been a general feeling in the industry that the mainframe is dead and that mainframe systems would be rewritten or replaced to run on modern hardware (i.e. UNIX or Linux servers). I'm not saying this was a universal feeling, or that there were not a significant number of people pointing out how difficult and/or unnecessary this was, but you could certainly see this feeling in upper management and in the budgets that were assigned to maintain the existing systems.
Coming up to Y2K, there was a concerted push to get as many applications as possible off the mainframe, and after Y2K, as the budgets dried up, the applications that remained were going to stay on the mainframe for some time. Here it is, twenty years later, and yes, they are still running.
So why is there so much noise about a shortage of COBOL programmers?
Well, many mainframe sites have job openings, so it would be reasonable to assume that they are simply unable to find qualified applicants.
Some US universities still teach COBOL - see this list from Micro Focus. Also, many overseas schools teach COBOL, and when working at companies all over the US, I consistently ran into excellent programmers who had only recently come to the US.
However, look at the pay being offered. Fifteen years ago, six-month contracts to work on mainframes for developers with 5+ years' experience were paying $45/hour. That was not considered great pay even then. If you poke around on job sites today, you will find jobs being offered at the same hourly rate as fifteen years ago! The best rate I saw was $60/hour, but they wanted someone with 10+ years' experience, and the list of required skills was really extensive.
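A flat nominal rate over fifteen years is really a pay cut in disguise. A minimal sketch of that arithmetic, assuming a rough 2% annual inflation rate (an assumption for illustration; actual CPI figures vary year to year):

```python
# Hedged illustration: what a flat nominal hourly rate means in real
# terms. The 2% annual inflation rate is an assumption, not CPI data.

def inflate(rate_then, annual_inflation, years):
    """Nominal rate needed today to match the old rate's purchasing power."""
    return rate_then * (1 + annual_inflation) ** years

equivalent = inflate(45.0, 0.02, 15)
print(f"${equivalent:.2f}/hour")  # roughly $60.57/hour
# So a $45/hour contract today buys about 25% less than the same
# $45/hour contract did fifteen years ago.
```

Under that assumption, even the best rate on offer ($60/hour) merely keeps pace with what $45/hour bought fifteen years ago, while demanding twice the experience.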
Oh, and the worst thing about the jobs I looked at? They were all strictly onsite - no remote work.
It would be reasonable to assume that developers who know COBOL and who also know other languages are attracted to jobs that pay better and have more flexible working conditions. Yes, there is also the attraction of writing new code and building a skill set for the future, rather than poring over huge legacy programs to make minor changes, but new developers could be attracted to the legacy code if the pay were good enough.
I think that the problem is that the core IT organization is taken for granted and under-funded. If companies valued these skills, or were truly unable to fill these positions, they would offer more pay, and be willing to hire junior developers with fewer skills and train them.
I hope that not too many people will get caught up in the hype and train themselves on COBOL thinking they can get a good job or help out the states. If they are only paying $45/hour for a developer with 5+ years' experience, what are they paying a junior developer - minimum wage?