For effectively all new development, the COBOL language is irrelevant. Many seem to think that Java is irrelevant, too, but I don't think that's the case. The problems the two languages were trying to solve are different, and Java is still appropriate for current architectures, whereas COBOL is not.
Recently, a discussion popped up about why Java used int for array sizes (in the context of someone wishing for Strings that had more than Integer.MAX_VALUE characters). Someone referred to it being similar to the 2038 problem, and someone made a reference to the COBOL language, and that got me thinking.
The COBOL legacy
Why do we refer to COBOL as being dead, when there are claims that more lines of COBOL are still in use than of any other programming language?
COBOL is the Common Business Oriented Language, a product of a team including Rear Admiral Grace Hopper. It was never really considered cutting edge, except possibly in one dimension: It described things very clearly and precisely. (That was part of its design: it was meant to be a language that used somewhat-natural English.)
Data types were allocated statically, and specifically. If you needed a fixed-point number with seven decimal places of accuracy, well, that's exactly what you asked for: You'd describe it as PIC 999V9999999, where the V marks the implied decimal point. There were eventually more compact storage formats (packed decimals and such), but the general concept was that you described exactly what you wanted, and that's exactly what you got.
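For comparison, the closest everyday Java analogue to that kind of declaration is probably a BigDecimal with an explicit scale -- a rough sketch, with the class name and sample value invented for illustration:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class FixedPointSketch {
    public static void main(String[] args) {
        // Rough analogue of PIC 999V9999999: three digits before the
        // implied decimal point, exactly seven after it.
        BigDecimal value = new BigDecimal("123.4567890");
        // setScale pins the number of decimal places, much as the PIC
        // clause pinned them at declaration time.
        BigDecimal pinned = value.setScale(7, RoundingMode.HALF_UP);
        System.out.println(pinned.toPlainString()); // 123.4567890
    }
}
```

The difference, of course, is that the PIC clause also dictated storage, whereas BigDecimal is an arbitrary-precision object; the analogy is about declared intent, not representation.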
Likewise, program code was simple. You described code in terms of procedures, or paragraphs. The actual executable code was simple to the point of absurdity:
001000* THIS IS A COBOL COMMENT
001010* LOTS OF REALLY VERBOSE STUFF HERE...
001020 ADD LINE1 LINE2 LINE3 LINE4 GIVING SUB-TOTAL.
001030 MULTIPLY SUB-TOTAL BY TAX-RATE GIVING TAX.
001040 ADD SUB-TOTAL TAX GIVING TOTAL.
001050* MORE REALLY VERBOSE STUFF HERE.
Ah, those were simpler times. You could always use a less grade-school method of calculating the total, of course, even in COBOL; it had a fairly advanced computational verb (unsurprisingly called COMPUTE, if you can imagine), but there was actual value in the simplicity.
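For contrast, here is one way that same COBOL arithmetic might be rendered in Java today (the class and method names are my own, and BigDecimal stands in for COBOL's fixed-point fields):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class InvoiceTotal {
    // Sum the line items, apply the tax rate, then add the tax back
    // to the subtotal -- the same steps as the COBOL paragraphs above.
    static BigDecimal total(BigDecimal taxRate, BigDecimal... lines) {
        BigDecimal subTotal = BigDecimal.ZERO;
        for (BigDecimal line : lines) {
            subTotal = subTotal.add(line);
        }
        BigDecimal tax = subTotal.multiply(taxRate)
                                 .setScale(2, RoundingMode.HALF_UP);
        return subTotal.add(tax);
    }

    public static void main(String[] args) {
        System.out.println(total(new BigDecimal("0.10"),
                new BigDecimal("10.00"), new BigDecimal("20.00"))); // 33.00
    }
}
```

Shorter, certainly, but notice how much of the "what is happening here" has moved out of the statements themselves and into names and conventions -- which is exactly the trade-off the rest of this article is about.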
The original power of the COBOL language
Remember, this is the Common Business Oriented Language. This was before big data, it preceded data science, and machine learning was magic; even Amazon Mechanical Turk classification was far in the future and seemed like a form of voodoo, and AIs as simple as 2001's HAL were beyond imagining in terms of actual code.
COBOL was designed to provide computing power to businesses. That meant explaining things to stakeholders in a time when the most complex machines with which most businessmen were familiar were mechanical adding machines.
To that clientele, it was comfortable to read very simple descriptions of how things were done -- add these to get that, multiply this and that and then add to get the final result.
Now fast forward from then (whenever "then" is, probably as late as the early 1980s) to today.
The situations are different. Businessmen are used to computers -- their smartphones pack more processing power and memory than the hottest mainframes of yesteryear. This builds a level of trust in the power of the machine.
Stakeholders are also used to not having as much control as they used to. I once had to explain how specific lines of COBOL code fulfilled the tax law (very specifically!) to a pair of senators. Today, they'd be familiar enough with programming not to need a walk-through of such simple expressions. I would no longer have to show them code and hope they could understand it; people are familiar enough with the concept of programming that generalities are enough.
Also, the problems are different. Back in the 1980s, most "real" code (meaning: code that ran businesses, as opposed to games or something that ran on those silly microcomputers) was designed around batch processing. Systems would accumulate changes over a period of time (often a day) and process every one of those changes in a single run of a program.
Now, we have systems that respond to hundreds of thousands of transactions in real time. Card issuers can flag a likely fraudulent use of a card within seconds, sending the user a notification that the card number might have been stolen. Stock market trading applications react in less than a millisecond, applying incredibly complex algorithms to make decisions.
I can see appreciating COBOL's simplicity in expressing some of those algorithms, although the nature of variable declarations would drive one mad today. But I can't imagine COBOL being good at expressing some of these complex algorithms -- they're just too complex. They beg for abstractions that COBOL struggles to provide.
You can still see this in action today, if you're lucky. Look at your bank account. Charges and deposits accrue during the day and are often applied (or posted) around midnight. This is very reminiscent of the old dinosaur COBOL programs -- and if truth be told, it's probably COBOL that's running that process even today.
My initial thought here was to compare Java to COBOL, in light of Java's age (nearly 20 years old now, slightly younger than Ruby, which is itself younger than Python). The instigator was thinking that integer indexes on arrays seemed archaic, and in a day when many of us deal with terabytes of data, maybe they are.
But it's not just 32-bit indexes that make or break a language, it's how well that language can represent and simplify problems. Object orientation for the win, in terms of expressing simplicity, but note that there are object-oriented features for COBOL. To this dinosaur's eyes, they look decidedly non-COBOL-ish, but I'm sure they're great. For someone else.
COBOL suffered over time because programmers wanted and needed to be able to write concise code that expressed complex concepts.
Java, too, has suffered some from this (thus the rise of languages like Scala and Kotlin, among many others). It's far more concise than COBOL, to be sure, but compared to Kotlin or Scala, it's emphatically periphrastic.
32-bit indexes seem to be a reasonable optimization from Java's earliest days, when computers with 64-bit word sizes were rare; a 64-bit index would add all kinds of useless complication to array accesses even today, meaning that uses of String and most of the collection API would have ground to a halt with synchronization and other requirements. (In Java, updating a long is not guaranteed to be an atomic access.) Furthermore, using a long for array access is entirely unnecessary for nearly every application in programming. Most arrays are tiny, so forcing them all to handle sizes that would exceed most machines' available RAM would have been stupid.
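A small sketch of what the int-indexed design means in practice (the class and helper method here are hypothetical, just for illustration):

```java
public class IntIndex {
    // Arrays in Java are indexed by int, so an array's length can
    // never exceed Integer.MAX_VALUE (2^31 - 1) elements.
    static int maxArraySize() {
        return Integer.MAX_VALUE;
    }

    // Indexing with a long doesn't even compile; the index must be
    // narrowed explicitly, which forces the caller to confirm the
    // value actually fits in an int.
    static int elementAt(int[] array, long index) {
        if (index < 0 || index > Integer.MAX_VALUE) {
            throw new IndexOutOfBoundsException(
                "index does not fit in an int: " + index);
        }
        return array[(int) index];
    }

    public static void main(String[] args) {
        int[] data = {10, 20, 30};
        // data[2L] would be a compile error: possible lossy conversion.
        System.out.println(elementAt(data, 2L)); // 30
    }
}
```

Anyone who genuinely needs more than two billion elements ends up sharding across multiple arrays or going off-heap -- which is arguably where such data belongs anyway.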
However, I'd say that Java, while verbose, is not so verbose that it's not likely to be viable for the next ten years or more. For one thing, it's gained features (like lambdas) that help reduce its wordiness while adding capabilities; for another, it's object-oriented from the very beginning, making it a good bridge between functional and object-oriented designs.
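As an example of that reduced wordiness, compare an anonymous inner class with its lambda equivalent (the list contents here are arbitrary):

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class LambdaDemo {
    public static void main(String[] args) {
        List<String> names = Arrays.asList("Grace", "Ada", "Barbara");

        // Pre-Java-8 style: an anonymous inner class for a one-line comparison.
        names.sort(new Comparator<String>() {
            @Override
            public int compare(String a, String b) {
                return a.compareTo(b);
            }
        });

        // Java 8+: the same intent as a lambda
        // (or, more idiomatically still, Comparator.naturalOrder()).
        names.sort((a, b) -> a.compareTo(b));

        System.out.println(names); // [Ada, Barbara, Grace]
    }
}
```

The two calls do exactly the same thing; the lambda simply strips away the ceremony that used to obscure the one line that mattered.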
COBOL has withered, for the most part, because the language was never designed to handle problems such as those we're confronted with almost daily. Java has not, because it was.