

Is it a mistake to teach Java as a first programming language?

Is it right to teach Java to students as an introductory programming language? If not, what are the alternatives to Java when it comes to learning how to code?

Java remains the most popular programming language, but few people would call it the coolest programming language. Some may even argue that it's an increasingly outdated language and, for that reason, many educators are reluctant to teach Java.

So that raises the question: Should schools and universities still teach students Java programming? And, if not, which alternatives to Java are the most viable?

Who wants to teach Java?

Back in my college years -- about a decade ago -- Java was the go-to programming language in introductory computer science courses. If you wanted to learn a different language like C, Python or PHP, you'd have to take a course dedicated to that language or its related applications.

Fast-forward to the present, however, and Java no longer enjoys that hallowed university status. By 2014, Python had replaced Java as the go-to programming language for introductory computer courses at major universities, according to findings from the Association for Computing Machinery.

This research wasn't comprehensive, though, as it focused on only the most selective U.S. universities -- and it represents a single data point that is now several years old. Still, I have a strong feeling that if you conducted a comprehensive survey of the programming languages used in introductory computer science courses, you'd find that Java is not at the top of that list.


I'm sure plenty of departments still teach Java, but I suspect that a majority have shifted to another language -- probably Python -- for their introductory courses. You might even find the same thing in high schools that teach computer programming.

Reasons not to teach Java

Is Java's decline a good thing or a bad thing? That depends on who you ask, of course. But, in general, there are good reasons to change how computer science programs teach Java and other programming languages in the early years of school.

Java is verbose. Java programmers who are honest with themselves will admit that Java is a more verbose language than most in the sense that it takes a fair amount of code to achieve a simple task. Maybe that's okay if you're a professional programmer and can churn out code quickly.

However, will a student trying to learn to program really want to write three or four lines of code just to print a single string to the terminal? Python, for instance, requires only a single line:

print('my string')
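For comparison, here is what the same task looks like in Java -- a minimal sketch, with the class name chosen purely for illustration:

```java
// Printing one string in Java requires a class, a main method with its
// full signature, and a qualified call to System.out.
public class Hello {
    public static void main(String[] args) {
        System.out.println("my string");
    }
}
```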

Factor in learnability. You might argue that Java's status as the most widely used language means that everyone should learn it. After all, plenty of professional programmers use Java daily. Lots of important applications are written in Java and, even if everyone stopped writing new applications in Java, we'll no doubt be maintaining legacy Java codebases for decades.

However, the fact that it is the most popular enterprise language and will remain widely used for a long time to come does not mean we should always teach Java to programming students first. If you're a new computer science student who wants exposure to the essentials of application design and development in a simple way, Java is not the best starting point.

Java is a compiled language. That's all well and good if you are a DevOps engineer building Java applications for a Jenkins pipeline. But if you just want to learn programming, it's not ideal to have to compile applications before you can test them. It's simpler to stick with a scripting language.

You can learn about build processes and delivery pipelines later if that's where your career takes you. And you may not want or need to. Not everyone who takes an introductory computer science course is going to become a professional developer and compile code.

There are alternatives to Java. One of the first rationales that you often hear for teaching Java is, "It's object-oriented!" It's true that Java is the poster child of object-oriented programming (OOP). Plenty of other languages, however, can be used for OOP.

Plus, you can teach the principles of OOP with no specific language attached. OOP is a concept and an architectural strategy more than it is a feature of specific languages. Beyond this, the microservices trend is already making OOP less important. As microservices deployments do away with monoliths, OOP may not even matter for much longer.
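For readers who want to see what those OOP principles amount to in practice -- an interface as a contract, encapsulated state, and a caller that works polymorphically against the abstraction -- here is a minimal Java sketch; all class and method names are illustrative:

```java
// An interface defines a contract that concrete classes fulfill.
interface Shape {
    double area();
}

class Circle implements Shape {
    private final double radius;   // encapsulated state
    Circle(double radius) { this.radius = radius; }
    public double area() { return Math.PI * radius * radius; }
}

class Square implements Shape {
    private final double side;
    Square(double side) { this.side = side; }
    public double area() { return side * side; }
}

public class ShapeDemo {
    // totalArea never needs to know which concrete Shape it is given.
    static double totalArea(Shape[] shapes) {
        double total = 0;
        for (Shape s : shapes) total += s.area();
        return total;
    }

    public static void main(String[] args) {
        System.out.println(totalArea(new Shape[] { new Circle(1.0), new Square(2.0) }));
    }
}
```

The same interface-and-implementation pattern can be expressed in Python, C#, Kotlin and many other languages, which is the point: the concept is not tied to Java.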

There's no standard Java Development Kit (JDK). Between Oracle JDK, OpenJDK and various vendor-supplied JDK platforms, things can get confusing. The nuances between different JDKs could mean that code you write for one platform won't work properly on another, which can frustrate students learning to program for the first time.

You can try to control this challenge by requiring all of your students to use the same JDK, of course. But why not just avoid the issue altogether? Most other languages have just one standard implementation -- usually open source.

Java still has benefits

This is not to say that Java is a bad language to teach -- it has its selling points. Java is cross-platform. It has a healthy ecosystem of development tools, including Eclipse, that make life a lot easier for new programmers. It's also easy to find documentation and community support for Java because so many people use it.

So, is it a mistake to teach Java? That might be a bit extreme. But Java is not the best first programming language to teach students today. Languages like Python and C++ are better alternatives, for my money.


This was last published in November 2018


Join the conversation

What do you think is the best alternative to Java when teaching students how to code?
Python would be my first choice, but if you want to teach a statically typed language, I'd go for C#.
I might go for Groovy. It's scriptable, but also opens doors into the JVM world.
The build environment and choice of tools are rather superficial criteria for choosing an educational language. Compiling provides more insight into what actually happens when programming, so scripting is not an advantage.

The purpose of an educational language is to convey concepts of programming and software design (as opposed to "coding"). A first language should have a static typing system for clearer semantics. It should also support the most fruitful paradigm, functional programming (which many languages now support, though often in a limited or crippled way). I would go with Scala.
I'd likely go with Scratch

https://scratch.mit.edu/about/
I don't understand how one can recommend C++ as a good alternative, seeing as the two languages are related and share a lot of the same syntax. I'd say C++ is even worse, as you are dealing with finer-grained concerns like manual memory management.
I'd agree.
There is nothing wrong with Java being the first language introduced in computer science courses.

Java is verbose -- This is what you normally hear from people who talk about Java, but those people most likely programmed in Java 5. Java keeps improving and is still one of the top languages to introduce students to. For example, where you once had to initialize a new variable with int x = 4;, in recent Java you can write var x = 4;, just like in other languages. There are lots more improvements in Java that people do not take note of; instead, lecturers use lecture notes from 2004 to teach students in 2018. The Java of 2018 is different from that of 2004.
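The local-variable type inference this comment refers to has been standard since JDK 10. A minimal sketch (the class and method names are illustrative):

```java
// Local-variable type inference ('var', JDK 10 and later): the compiler
// infers the static type from the initializer, so the declaration is
// shorter but just as strongly typed as the explicit form.
public class VarDemo {
    static int sum() {
        int x = 4;   // explicit type: works in every Java version
        var y = 4;   // inferred as int: JDK 10 and later
        return x + y;
    }

    public static void main(String[] args) {
        System.out.println(sum()); // prints 8
    }
}
```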

Learnability -- Computer science students go through tougher tasks than learning Java, and those courses were not removed for being hard to learn. In fact, people who learn languages like Java pick up other languages easily. If you know Java, you can easily learn C#, C++, Kotlin, Scala, Go... the list goes on.

There's no standard Java Development Kit (JDK) -- I don't see how this could be confusing; everyone knows the Oracle JDK as the popular choice, and it's free for use in development and test environments. If a lecturer does not want to use the Oracle JDK, there are plenty of options, like the AdoptOpenJDK builds. Personally, I don't see why it should be hard for a computer science lecturer to guide students here.

If it's high school students who want to learn to program, they could go with simpler languages. But for a computer science student, Java shouldn't be a problem.

In fact, to suggest that C++ is a better alternative to Java is confusing, because C++ is more complex than Java.
I totally support this comment!
I would say Java is a great language to teach. Python is also a great alternative, but if you are teaching OOP concepts, blackboards and theories are not enough -- you need hands-on learning, and that is where Java comes in!
C++? No! It's too overwhelming for new users.
Python? Yes, it's a good alternative.