Regarding Java

quirkyusername wrote:
Why not just use the best tool for the job in all situations, rather than argue over which language is 'best'?
There is no such thing as "best tool for the job". Sure, you could say that you won't be writing a boot loader in Perl, but that's a corner case. Imagine you want to write a compiler. I say you should be using FP. Prove me wrong (it can't be hard...)
closed account (S6k9GNh0)
No, I just find it difficult to think of everything as an object. For instance, we can picture a binary tree as an object, and we can picture its nodes as objects. But I can't really picture the utility functions that help parse the data coming out of the binary tree as objects. In Java, you'd make a class called Utility or something and implement them as static methods. I don't like that.
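The pattern being complained about looks something like this sketch (the `TreeNode`/`TreeUtil` names are invented for illustration): the data structure maps naturally onto objects, but the helpers end up as static methods in a catch-all class, because Java has no free functions.

```java
// The data structure maps naturally onto objects...
class TreeNode {
    int value;
    TreeNode left, right;
    TreeNode(int value) { this.value = value; }
}

// ...but the helper functions end up as static methods in a
// catch-all "utility" class, since Java has no free functions.
final class TreeUtil {
    private TreeUtil() {} // no instances: this class is just a bag of functions

    // Sum all values in the tree -- a plain function dressed up as a static method.
    static int sum(TreeNode node) {
        if (node == null) return 0;
        return node.value + sum(node.left) + sum(node.right);
    }
}
```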
I don't think the problem is everything being an object. Python, Ruby and others accomplish this without the Java nonsense. It's great for introspection. And if you don't need OO capabilities, the model just gets out of the way.

IMO, the problem with the way Java does it is the idea that being OO means having to define a class even when you just need a function. It's the "Kingdom of Nouns" problem: http://steve-yegge.blogspot.com/2006/03/execution-in-kingdom-of-nouns.html
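To make the "Kingdom of Nouns" point concrete, here's a minimal sketch (interface and names invented) of what passing a one-line function around looks like in pre-lambda Java: the "verb" has to be wrapped in an interface plus an anonymous class before it can go anywhere.

```java
public class KingdomOfNouns {
    // A single-method interface: the "verb" we actually want to pass around.
    interface IntFunction { int apply(int x); }

    // A higher-order function: applies f twice to x.
    static int applyTwice(IntFunction f, int x) {
        return f.apply(f.apply(x));
    }

    public static void main(String[] args) {
        // Pre-Java-8, even a one-liner needs a class (anonymous here) to hold it.
        IntFunction addOne = new IntFunction() {
            public int apply(int x) { return x + 1; }
        };
        System.out.println(applyTwice(addOne, 40)); // prints 42
    }
}
```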
There is no such thing as "best tool for the job" .


I tend to disagree. Clearly some tools are better than others for specific jobs.

Taking that idea a little further philosophically, we can say that for a given set of requirements there does not appear to be a single best solution (tool) to meet them. However, as requirements are added, I believe the solutions will converge to a single best one. In other words, if one best tool for the job isn't apparent (and having one was your goal), you need to acquire more information.

Obviously that may not be practical but practicality doesn't disprove the existence of a best tool for a job.
But, I can't really picture some of the given utility functions that help parse the data that comes out of the binary tree as an object.

Treating algorithms as objects can be very useful under certain circumstances.
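For example, the JDK itself packages a comparison algorithm as an object: a `java.util.Comparator` instance can be handed to `Arrays.sort` to parameterise the ordering, without touching the sort routine.

```java
import java.util.Arrays;
import java.util.Comparator;

public class AlgorithmAsObject {
    // The comparison algorithm, packaged as an object that can be passed around.
    static final Comparator<String> BY_LENGTH = new Comparator<String>() {
        public int compare(String a, String b) {
            return Integer.compare(a.length(), b.length());
        }
    };

    public static void main(String[] args) {
        String[] words = { "pear", "fig", "banana" };
        // The same sort routine works with any ordering we hand it.
        Arrays.sort(words, BY_LENGTH);
        System.out.println(Arrays.toString(words)); // [fig, pear, banana]
    }
}
```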

And on the best-tool thing: it depends. A chainsaw is arguably a better tool for cutting trees than a handsaw, but if you don't know how to operate a chainsaw, you are better off just using the handsaw instead of using the chainsaw with half-assed knowledge and ending up cutting your legs off.
Some tools are better than others for specific jobs, but there are many that offer different equally fine solutions and even more that offer identical solutions with slightly different syntax. It is much more important whether a programmer feels comfortable with his language of choice.
I agree that with a sufficient number of requirements one might find the superior tool, but I don't think you ever have enough of them in real life.
Also, it seems to me that it is a lot easier to find a solution to a problem when thinking in terms of a specific language (or paradigm) and not abstract algorithms that can be applied to any language which best suits your needs.
What are the semantic differences between this (C++) ... Java...
In the OO Paradigm, there's a subtle difference between static members of a class and a namespace. Outside of the OO Paradigm there's no point discussing Java constructs.
I was chatting over Ventrilo today with a friend of my roommate's. He said he wanted to wait until he learned a powerful language like Java before trying anything difficult like OS design; right now he's stuck using C++ for school. He's in his first semester of computer science, so it's understandable that he's a little confused. But here's my beef with this: if what he says is true, his school teaches language constructs with C++ and then everything else they do is in Java. They don't even bring up any paradigm beyond OOP. I told the guy to feel free to ask me if he ever has any questions (as I've had several years of experience beyond my formal education, as well as an environment that heavily stimulated my comp sci ventures). His story could be a bit fabricated, but if it isn't, that's just plain sad.

Taking that idea a little further philosophically, we can say that for a given set of requirements there does not appear to be a single best solution (tool) to meet them. However, as requirements are increased I believe solutions will converge to a single best


When treated as a multicriteria optimisation problem, it is just the opposite. If you have just one criterion, there is usually a single best tool. When you add more requirements, you get a set of Pareto-optimal tools. Programmers assign different weights to the requirements, so you get the endless language holy wars: "C++ is faster than Java", "But in Java you write code faster", and so on. And don't forget that familiarity with a given language is also a criterion: someone who knows C very well and doesn't know Java will be more productive in C than in Java until he learns Java to some level.
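The Pareto idea can be sketched in a few lines of Java. The tool names and scores below are entirely made up; the point is only that with two criteria, several non-dominated tools can survive, and no single "best" falls out.

```java
import java.util.ArrayList;
import java.util.List;

public class ParetoTools {
    // Hypothetical tools scored on two criteria; higher is better on both.
    static class Tool {
        final String name;
        final int runtimeSpeed, devSpeed;
        Tool(String name, int runtimeSpeed, int devSpeed) {
            this.name = name;
            this.runtimeSpeed = runtimeSpeed;
            this.devSpeed = devSpeed;
        }
        // o dominates this tool if it is at least as good on both
        // criteria and strictly better on at least one.
        boolean dominatedBy(Tool o) {
            return o.runtimeSpeed >= runtimeSpeed && o.devSpeed >= devSpeed
                && (o.runtimeSpeed > runtimeSpeed || o.devSpeed > devSpeed);
        }
    }

    // The Pareto-optimal set: every tool not dominated by another.
    static List<Tool> paretoFront(List<Tool> tools) {
        List<Tool> front = new ArrayList<Tool>();
        for (Tool t : tools) {
            boolean dominated = false;
            for (Tool o : tools) {
                if (t.dominatedBy(o)) { dominated = true; break; }
            }
            if (!dominated) front.add(t);
        }
        return front;
    }

    public static void main(String[] args) {
        List<Tool> tools = new ArrayList<Tool>();
        tools.add(new Tool("C++", 9, 5));      // faster at runtime
        tools.add(new Tool("Java", 7, 7));     // faster to write
        tools.add(new Tool("SlowLang", 4, 4)); // dominated by Java
        for (Tool t : paretoFront(tools)) {
            System.out.println(t.name); // prints C++ and Java, not SlowLang
        }
    }
}
```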


Outside of the OO Paradigm there's no point discussing Java constructs.


Because outside of the OO paradigm they are equivalent to the procedural paradigm, just as in the code I posted. The only distinction is writing "class" or "object" instead of "namespace". You could say they are not equivalent only if writing purely procedural code in Java took more effort than in a purely procedural language (like C). But that is not the case. It is even less code in Java, because in C and C++ you have to write function prototypes to allow intermodular calls, and in Java you don't. Java supports the procedural and modular paradigms fully. Just think of classes as namespaces and ignore all the rest.
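A minimal sketch of "class as namespace" (the `MathModule` name and `gcd` function are invented for illustration): a final class with a private constructor and only static methods behaves much like a namespace holding free functions, and callers in other source files need no prototype or header.

```java
// A Java class used purely as a namespace/module: no objects, no state,
// just grouped functions. Any other "module" can call MathModule.gcd(...)
// directly -- no prototype or header required, unlike a cross-module call in C.
public final class MathModule {
    private MathModule() {} // never instantiated; the class name is just a qualifier

    // Euclid's algorithm for the greatest common divisor.
    public static int gcd(int a, int b) {
        return b == 0 ? a : gcd(b, a % b);
    }
}
```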

BTW 1: Why are compiled C files called "objects" and given an ".o" extension?
BTW 2: Java is a multiparadigm language just as C++ is, so restricting the discussion to OOP only is plain stupid. Supported paradigms: procedural, modular, OO, generic. If you try hard enough you can also do FP in it (to some very limited extent, but at least it supports closures, while C++ does not) or AOP.
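As a sketch of those "very limited" closures (the `Counter` interface and names are invented): a pre-Java-8 anonymous inner class can capture enclosing locals, but only final ones, hence the classic one-element-array workaround for mutable state.

```java
public class Closures {
    interface Counter { int next(); }

    // A closure of sorts: the returned object captures local state from
    // makeCounter. Captured locals must be final in (pre-8) Java, so
    // mutable state needs the one-element-array trick -- one reason the
    // FP support counts as "very limited".
    static Counter makeCounter() {
        final int[] count = { 0 };
        return new Counter() {
            public int next() { return ++count[0]; }
        };
    }

    public static void main(String[] args) {
        Counter c = Closures.makeCounter();
        System.out.println(c.next()); // prints 1
        System.out.println(c.next()); // prints 2
    }
}
```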
Why are compiled C files called "objects" and get an ".o" extension?

Because that's what compiled files that contain executable code and metadata are usually called? http://en.wikipedia.org/wiki/Object_file

They don't even bring up any paradigm beyond OOP

What kind of school is that? I'd have thought procedural/modular programming would be taught before object orientation, as those concepts are easier to grasp. Though I gotta admit, in the only "computer science" classes I've visited so far (they didn't really deserve the name; it was more about computers than about science), the "OOP" consisted of teaching pupils to stuff everything into classes (basically, they taught how to program procedurally with classes instead of OOP), with the pupils winding up understanding neither procedural, modular, nor object-oriented programming.
closed account (z05DSL3A)
rapidcoder wrote:
Why are compiled C files called "objects" and get an ".o" extension?

I have a feeling that it comes from a bastardisation of "objective". As in, the source code gets compiled into the objective code (meaning the target machine-language code). Over time this got shortened to "object code"...
If it's the target of an action, it's literally the object of the action (as in subject/object). Anyway, I think the expression "object oriented" only went mainstream much later than "object code".

closed account (S6k9GNh0)
They don't even bring up any paradigm beyond OOP


My high school had "computer science" that consisted of nothing. They started with something called Scratch, then used Alice, then moved to the big-boy language called Java. However, we weren't taught any programming concepts; we were just taught that this does that and that does this. We weren't taught about design, etc.
So, in other words, it was taught just like any other subject? I do recall memorizing a multiplication table early in school, before learning how to multiply.

Since the majority of Java is OO, I wouldn't expect anything different.
closed account (S6k9GNh0)
@moorecm: That's funny, but that's sadly how it is. I personally learned a tad differently, since I was taught slightly more difficult mathematics (and a few other subjects) at a younger age, but it sucked to see my friends struggling with basic math all the time because of how poorly they were taught as a group.
This is the problem with education. You learn to memorise things, not to understand them.
I was able to subvert this teaching paradigm simply by not paying attention in class. It didn't even hurt my marks; I actually ended up scoring better than average in almost all subjects :O
My method is to work hard in class and then be too apathetic to do any work at home. It means I get low Ds when I should be capable of getting an A in every subject I took.
Chrisname, that is my method incarnate... well, it used to be. I'm quickly revising my ways.
@chrisname sounds similar to my method. But somehow I've managed to change, too late of course =)