I'm tired of reading all these blog posts about how Java is so much worse than languages like Python and Ruby because Java programs take more lines of code or because type-checking is evil or because Java programmers are morons.
There's a reason why most enterprise applications are written in Java, not Ruby, and it's not because Java programmers are stupid.
The main argument I'm hearing recently is that "more lines of code for the same job is automatically bad." This is clear and sensible on the surface. More lines means more chances for a bug. More lines means more to comprehend when you're maintaining code. And if a project gets too big, no one can understand it completely.
This glosses over the fact that many applications have no choice but to be millions of lines of code. I know, 37signals wrote an application in 600 lines of code and they are millionaires. But AutoCAD is going to be big, Quickbooks is going to be big, Firefox is going to be big, and yes, even a code review tool like Collaborator is going to be big.
So let's pretend that Firefox was rewritten in Ruby, and instead of being 10 million lines of code it's only 5 million. That doesn't sound any easier to maintain to me -- 5 million lines of code is still more than anyone can fathom.
So my question is: Which language makes it easier to work on large code bases?
In Java I can refactor with confidence. Move code around, change signatures, add a function to an interface, and I know nothing is broken. Python can't give me that guarantee.
In Java I can say "Show me all the places where this code is called." When changing the behavior of a method, that's critical, especially in a large code base where you can't possibly keep track of every caller yourself. Ruby can't show you that.
In Java, if another developer checks in a change that breaks a type or signature my current work depends on, I see it immediately and I can fix it now. Big projects mean lots of programmers, which means things like that happen all the time. In Python I won't find out until run-time, and even then I might not find out until it's at a customer site, doing something I didn't test for.
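To make that concrete, here's a hypothetical Python sketch of the failure mode (the function names are invented for illustration): a teammate changes a function's signature, and the stale call site loads without complaint -- nothing objects until that exact code path actually executes.

```python
# Hypothetical sketch: a teammate changed save_report() to require a
# 'format' argument. In Java this would break the build instantly;
# in Python the stale caller below imports and loads just fine.

def save_report(data, format):
    """New signature -- the old one was save_report(data)."""
    return f"saved {len(data)} rows as {format}"

def nightly_job():
    # Stale call site, written against the old signature.
    # Nothing flags it until this exact path runs.
    return save_report([1, 2, 3])

try:
    nightly_job()
    error = None
except TypeError as e:
    error = e
    print("discovered at run-time:", e)
```

If `nightly_job` only runs in some rarely exercised branch, that `TypeError` may well surface first at a customer site.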
The second reason I don't like the "more lines of code is bad" argument is that not all lines of code are created equal.
It's true that in Java you tend to have lots of filler code. Bean-style fields are a simple example -- instead of just having a field that other code can access you typically make a "getFoo()/setFoo()" combination, and then you add JavaDoc to that, and soon you have 20 lines of code instead of 5. Another example is in patterns like Adapter and Composite where you end up doing a lot of pass-through code.
The typical counter-argument is "The IDE can write that for you." The typical response is "Yes, but I still have to read that and make sure it's not doing anything funny if I'm a client of that code."
True, but still those lines of code are unlikely to have bugs, particularly if you have unit tests. More work, yes, more lines of code, yes, but more bugs as a result? No.
The fact is, most of that kind of code is written once and never changed, and that's fine. Does that really make the system more difficult to understand?
Plus, in this era of unit testing, if you're doing it well you can leave even more code unexamined. Seeing how it's supposed to behave (from the unit tests and documentation) is enough.
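As a hypothetical sketch of that idea (the class and its tests are invented here), a unit test can act as the executable documentation you read instead of the implementation:

```python
import unittest

class SlugMaker:
    """Pretend this is code you'd rather not read line by line."""
    def slugify(self, title):
        return "-".join(title.lower().split())

class TestSlugMaker(unittest.TestCase):
    # Reading these cases tells a client the contract
    # without opening the implementation at all.
    def test_lowercases(self):
        self.assertEqual(SlugMaker().slugify("Hello"), "hello")

    def test_joins_words_with_hyphens(self):
        self.assertEqual(SlugMaker().slugify("Big Code Base"),
                         "big-code-base")

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestSlugMaker)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The test names and assertions carry the specification; a green run tells you the boilerplate underneath is doing what it claims.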
Yes, you can unit-test your Python and Ruby code as well, but that doesn't stop another developer from injecting a method into your class at runtime that breaks everything.
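For illustration, a hypothetical Python sketch of what I mean (the class and method names are invented): any code that can see your class can swap out a method at run-time, so the behavior your unit tests verified is not the behavior that ships.

```python
class Invoice:
    def __init__(self, amount):
        self.amount = amount

    def total(self):
        return self.amount  # the behavior the unit tests verified

# Somewhere far away, another developer "helpfully" injects a
# replacement at run-time. No compiler step, no warning.
def discounted_total(self):
    return self.amount * 0.9

Invoice.total = discounted_total

patched = Invoice(100).total()
print(patched)  # 90.0 -- the tested behavior is gone
```

Your original test suite passed against the original `total`; after the injection it proves nothing about what callers actually get.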
In the end it comes down to which trade-offs you like best or which match the goals of the project. In theory anyway.
But I need a good reason to abandon a JVM fast enough to rival compiled C in many benchmarks, IDEs like Eclipse, tools for correctness, profiling, debugging, and analysis, a vast array of quality libraries, and millions of people who are already familiar with the environment.
"Fewer lines of code" and "I don't need types" isn't enough.