Java's lambda syntax rigidity exposes spoiled programmers' frailties

Consternation over Java's lambda syntax is the perfect example of how Java's evolution and incremental improvements have created a community of spoiled programmers.

We're spoiled.

In the good old days, coders asking how to do X in C were told "this is how you do it." Now we say "wouldn't it be nice if we could," with a realistic chance of having it happen.

As Java coders, we live in a golden age. We have more freedom to create techniques and approaches than we've ever had - and as a result, it's easy to forget how good we actually have it.

I was watching a chat channel a few days ago, and saw a number of people discussing things like scope in lambdas - protesting the requirement that local variables used inside lambdas be final or effectively final. (Why? Because lambdas capture local variables under the same rules as anonymous inner classes; see the JLS, sections 8.1.3 and 15.27.2.) They suggested new syntax to shadow a reference as final, sort of a way to autopromote finality.

It was a neat idea. To be honest, though, my first response was to think that it wasn't necessary, that the coder should just get over it, and use a single-element final array instead:

// suppose this reference gets reassigned later, so it is not
// effectively final and can't be captured by a lambda directly
String foo = "foo";

// the array reference is final, so a lambda can capture it
// and read or mutate its single element
final String[] fooWrapper = { foo };
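To make the trick concrete, here's a minimal, self-contained sketch (class and variable names are mine, not from the original) showing the final single-element array being captured and mutated inside a lambda - the sort of workaround I had in mind:

```java
import java.util.Arrays;

public class FinalArrayTrick {
    public static void main(String[] args) {
        // a lambda can't capture a mutable local variable, but it can
        // capture a final array reference and mutate the array's element
        final int[] counter = { 0 };

        Arrays.asList("a", "b", "c").forEach(s -> counter[0]++);

        System.out.println(counter[0]); // prints 3
    }
}
```

It works, but it's exactly the kind of contortion the chat channel was complaining about - the array exists only to smuggle mutable state past the capture rules.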

After thinking that the appropriate response was to shake my cane while shouting "Get off my lawn!" it hit me that my response was outdated. There is absolutely no reason not to push the envelope further - and people do it all the time, in marvelously different ways.

For example, I'm a fan of Lombok, which provides annotations to remove typical boilerplate for accessors and mutators in Java code, among many other features. One of those features is val - which will create a final reference (suitable for lambdas) and use type inference to boot - just like Scala's val keyword.

For example, with Lombok you could take the following code:

List<String> strings = new ArrayList<>();

... and replace it with this code:

val strings = new ArrayList<String>();

It's not quite equivalent - strings uses the actual inferred type, instead of the interface - but that's not actually that much of a loss, all things considered.
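As a rough sketch of what that buys you (Lombok's actual transformation happens at compile time via annotation processing; this is the plain-Java equivalent, written out by hand): the reference ends up final with the concrete inferred type, which makes it immediately usable from a lambda.

```java
import java.util.ArrayList;

public class ValDesugared {
    public static void main(String[] args) {
        // roughly what val gives you: the inferred concrete type,
        // marked final - so the reference is lambda-friendly
        final ArrayList<String> strings = new ArrayList<>();

        strings.add("hello");

        // legal capture: strings is final
        Runnable r = () -> System.out.println(strings.size());
        r.run(); // prints 1
    }
}
```

No wrapper arrays, no ceremony - the finality comes for free.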


As an aside, Lombok has many other useful features implemented as annotations - @Builder, @Synchronized, and @Wither, among others. I'm picking Lombok because it's probably the best-known example of this approach - but Google's @AutoValue applies similar techniques.

The thing to think about is that val and its ilk are new features, supported entirely at compile time, with no runtime dependencies whatsoever - and that gives the coder quite a bit of power at no cost in expressiveness.

The paradox of choice

Contrast that to the days when dinosaurs ruled the earth and we coded in K&R C. If you wanted a feature, there'd be some old codger (like me) hanging out, scoffing at your desire for newfangled widgetry, and we'd happily tell you that you could abuse the preprocessor by writing a macro (or a set of macros) to do that via token replacement.

In some ways, it was more convenient then - with so few options, it was easier to have a path forward, because it was the only path forward. People occasionally complained, sure, but the truth of the matter is that they went ahead and wrote their code, because they had so few other options.

There's really a tradeoff here, of course. Back in the olden days, when our best choice for watching TV on our own schedule was to hope the VCR worked, chances are that we went ahead and wrote our macros, if we really needed them. It generated some truly arcane code (look at Nethack's source code, which is actually really well written), but we didn't complain quite as much.

(It's also worth pointing out that people did still advocate for the languages to change - witness C's various standardization efforts over the decades, especially the introduction of ANSI C, which gave us real function prototypes in place of K&R-style argument lists. However, as a counterpoint to my own counterpoint, it's also worth pointing out that today's C is far more recognizable to the C coders of the 1980s than today's lambda- and generics-infused Java code would be to the Java coder of 1998.)

But at the same time, we have a marvelous and glittering world opening up; now, if you were to throw a tennis ball into a crowd, chances are you'd hit three or four people who program. That creates a lot of movement in the world of programming; sure, it means we might get a lot of crackpot ideas ("Can we write this in Ruby?") but at the same time, if we judge ideas on their own merits, we have a fertile environment for growth and development, with or without Oracle's active involvement.

And that is a wonderful thing.

You can follow Joe Ottinger on Twitter: @josephbottinger


This was last published in June 2016
