The need for other programming languages?

Hey guys,

This sort of topic has probably been discussed countless times, but I would like to revisit it in 2019.

So, first off: I began learning Java circa 2016. I enjoyed many aspects of it, but disliked others; for one, I felt that requiring everything to be in a class was unnecessary. That wasn't the driving factor that made me migrate to C++, though. I wanted to learn how computers work at a lower level, and from what I've read, C++ and C fill that requirement. I haven't really done much low-level work with C++ yet, if any, but it's nice to know that I can when I need to. Another big factor was that many parts of operating systems such as Unix and Windows are written in C or C++, and hence many of the operating systems' direct APIs, such as windows.h, are written in C. I also like many other features of C++, such as operator overloading, that languages like Java don't provide. But I digress.

So why the need for so many languages? Obviously Java was created to allow portability of code from one operating system to another; Python allows for quick, concise code with an easier learning curve than other languages; C# was Microsoft's answer to Java and was originally designed to run only on Windows (although I've heard this may be changing); Rust was developed to address the potential undefined behaviour of C and C++, allowing for less buggy code. Skip a few decades back and you have Fortran, ALGOL, Delphi, BASIC. So why did legacy languages like Fortran and ALGOL pretty much die out? What was the need for a new language like C to take over? Couldn't those languages do what C could do? So yeah, why the need for so many languages? Couldn't the developers of these legacy languages just have updated and upgraded them to survive and thrive in the modern programming landscape?

Also, why and (more importantly) how does a language like C or C++ offer greater low-level support? How can one language (C) offer more low-level support than, say, Java or Python?

thanks!

So why the need for so many languages?
-- there is no 'need'. We need fewer than 10 or so of the current set of languages. I can't think of anything I cannot do easily in one of {SQL, C++, JavaScript, shell/bat, HTML} that is not an artificial limitation (e.g. I can't write quick Excel code in C++ because Excel inherently uses Basic, but that restriction is artificial). And my short list leaves plenty of room (it's nowhere near 10, as I said) for things I don't do much of.

Some of what you ask has no sensible answer. There isn't much you can do in C that you can't do in Fortran. C won out by being native to the Unix OS when that took off, and by its takeover in the classroom. I don't know about ALGOL. The classroom issue is POWERFUL: the languages taught in schools are 'winning' just by the number of people who know them. Java's success is almost as much tied to schools moving to it from C++ as it is to portability.

Many of the languages you mention HAVE been updated and ARE active and modern. Fortran was quite active, and I was using it post-Y2K on new code, with Visual Fortran (like Visual Studio), and had several DLLs that came from Fortran. BASIC is still heavily used. Many of these languages are used by people outside of computer programming, e.g. engineers, or even 'power users' who know Basic and Python but don't code for a living (only as a hobby, or to automate simple stuff at work/home). Just because you don't know or use a language does not make it dead. "Sickly", maybe, but nowhere near dead.

The last question is really simple and you already answered it yourself.
Python, Java, etc. are portable. The opposite of portable is low-level, hardware-aware code. You can't move code that has Intel assembly embedded in the middle of it over to another type of processor. C++ lets you do that; Java does not, because Java forces you to be portable even if the code is 20,000 times slower, while C++ assumes you know what you are doing and made your own decisions. Languages that lack low-level support do so by design, to prevent the coder from doing something that won't work on another system.

Their lack of support for useful features is a different matter. These languages lack unsigned integers, for example, because their designers believe coders should not be writing bit-twiddling code. Even if you need an MD5 or CRC32, it's not allowed: that isn't what they wanted you to be doing, and by @#$% they are not going to let you do it! This is just 'baby-sitting', where you are prevented from using a feature by the arrogance of the designers. Operator overloading is another classic gap in Java (there are dozens).
Also note that the art of programming has been evolving over time. People change their minds about the programming techniques or paradigms that are useful to support natively in languages; hardware has changed in speed, design, and characteristics; compiler technology in particular has improved immensely even over the last 10 years or so, both in code generation and in techniques for things like type inference; the kinds of problems people are trying to solve are also typically different now than 20 years ago; and so forth.

That isn't to say that the "older" programming languages are no longer useful; rather, the use cases they were designed to facilitate are often less important compared to the newer features and facilities programmers have come to expect. FORTRAN is still virtually unparalleled in numerical computation, but nowadays compilers are smart enough to get pretty much the whole way there in languages like C++, while maintaining much greater usability and support for "new" paradigms that programmers find useful, such as functional techniques.

And then different people and different areas have different needs and different priorities, and so you get different languages catering to those. That's not even going into DSLs!
From my experience, many of the newer languages were developed because their creators found "all" existing languages "too complicated", "too error-prone", or "totally seventies".
Where "too complicated" usually means: "I'm too lazy to learn".
"Too error prone" means: "I'm too lazy to be disciplined" Or "I'm too lazy to test thoroughly".

In short, most "modern" languages aren't really developed to solve general problems.
They were developed to solve some problems the initial developer had.
Especially the languages developed by relatively young programmers, whose main argument against "old" languages is that they're old.

I bet every programmer thinks about developing 'some' language at some point in their career. A few of them actually do it. And a few of those few actually become known more widely.

Programming languages are no big deal. Students make them up as exercises. Programmers make them on the go to express business logic or to script a behavior. Computer scientists make them up as needed to test their ideas.

Out of tens of thousands of programming languages that exist (back before it was discontinued around 2003 or so, there was a project at Monash U that tracked over 8000 programming languages, with references for each one, I am sure it's over 20k now), only a few hundred make enough noise to be generally known, and only a handful become known to nearly every programmer in the world, each with a unique story.
Where "too complicated" usually means: "I'm too lazy to learn".
"Too error prone" means: "I'm too lazy to be disciplined" Or "I'm too lazy to test thoroughly".
What utter nonsense.

First, the point of technology is to make lives easier. There's no merit in doing something the hard way when there's an easier, equally effective way.

Second, regardless of how careful or how thoroughly someone tests, if something is error-prone an error will eventually slip through. There's value in replacing a construct with a less error-prone one. It's the whole reason we encourage people to use std::unique_ptr over naked pointers and std::lock_guard over lock-unlock pairs (and, more generally, RAII over manual management of resources), and why memory-safe languages like Java, C#, and Rust are successful. People make mistakes, period. The dumb solution to this problem is to say "well, they just need to be more careful". The smart solution is to build machines that can deterministically catch our mistakes and to use them.
The hotshots who think they'll never make a mistake so they don't need "safety nets" in their language are the ones writing all the hopelessly broken code other people have to fix.
The hotshots who think they'll never make a mistake so they don't need "safety nets" in their language are the ones writing all the hopelessly broken code other people have to fix.

Preach it, Brother!

:)
Memory-safe is good (see the C++ STL). Having to write workaround code to force something the language refuses to do (e.g. unsigned 32-bit integer manipulation) is frustrating. Among all these languages, there is still none that is full-featured AND fast-executing AND safe. So new languages will likely continue to be created until someone finally nails it with an efficient, easy-to-use, safe, and full-featured language. Human nature being what it is, even if we get that, people will probably still make more new ones...
The hotshots who think they'll never make a mistake so they don't need "safety nets" ...

I agree, those who actually think they make no mistakes are wrong. As you said, mistakes are made.
But too many times I've heard people excuse sloppy coding with "The compiler makes it right."
The promise of "safety nets" creates laziness and carelessness.
I'm speaking from experience.

... doing something the hard way when there was an easier, equally effective way.

That's my point.
If you know your stuff well enough, i.e. you've learned every detail of it, there is no "hard way".
There's just "the way things are done".
To someone who only uses click-and-drag programming, even BASIC looks like doing it the hard way.
Python isn't easier than C++ - if you're well trained.

Second, regardless of how careful or how thoroughly someone tests, if something is error-prone an error will eventually slip through.

Yes, and that's why the argument of "fewer errors thanks to a 'better language'" is moot.
The number of errors made depends on the skill of the coder.
A poorly trained programmer makes more mistakes even with a "safe" language than a well-trained programmer makes in a "less safe" language.

Besides, why is everybody so against learning?
When did that happen?

There's value in replacing a construct with a less error-prone one.

There is. However, without learning how to use it, you don't benefit from it. Again, it comes down to learning.
If you know your stuff well enough, i.e. you learned every detail of it, there is no "hard way".
There's just "the way things are done".
Even to an expert, there are things that are harder than others. Specifically, some things just take longer, or they're easier to screw up.

Yes, that's why the argument of "fewer errors thanks to a 'better language'" is moot.
The number of errors made depends on the skill of the coder.
Again, nonsense.
All other things being equal, statically typed languages are less error-prone than dynamically typed languages. They just have to be, because a dynamic compiler performs no checks on the semantics, while a static compiler performs some checks. Take an expert JS programmer, give them TypeScript, and they'll produce less-broken code.

All the way down the scale of manual-vs-automatic checking you get to theorem provers, where the compiler proves that the code implements a formal specification. Such languages can produce literally bug-free code (although there's no guarantee that the specification is applicable to a real world problem).

Besides, why is everybody so against learning?
When did that happen?
To me personally, learning has no value in itself. Learning is only valuable to the extent that it results in a useful skill.
Are you saying that if there existed a pill that could instantly give you a skill that would otherwise take you several years to learn, you would not take it (assuming it has no side effects)?

It is. However, without learning how to use it, you don't benefit from it.
Maybe, maybe not.
Regardless, learning a safe tool and then using it is easy. Carefully using an easy-to-learn dangerous tool is difficult. It's a one-off cost versus a continuous cost.
Python isn't easier than C++ - if you're well trained.

Where do you get these ideas?
I am well trained in C++ and have only dabbled in Python. It took me only a few minutes to connect to a database, query it, and print the results in Python... a task that takes a fair bit of effort in C++, including finding and installing libraries, which Python does in seconds without my having to research what to use. In under a day, I re-created a complex hash algorithm in Python that took a solid week to write in C++. It was much easier. It was also literally 10 times slower than the C++, but that wasn't my code; it was how pure Python handles integer math (we are not allowed extensions like SciPy or Cython). I hate Python, but it IS much easier to write.

The number of errors made depends on the skill of the coder.

Also a strange idea. I would argue that the number of lines of code equals the number of potential errors, and a language like Python that did my database program in 10 lines is less error-prone than the C++ version that had 4 times as many lines. Given equal skill in both, the one with more LOC is going to have a higher risk of errors.

No one is against learning useful things. But a lazy programmer is the best kind... they get more done with less effort. There is no point in doing extra work; it just takes longer. I'll take one lazy, smart, skilled guy over two 18-hours-a-day, by-the-book, GIGO programmers any day of the week.
... a task that takes a fair bit of effort in c++, including finding and installing libraries which python does in seconds

I was looking at the discussion purely from a language point of view.
Because the arguments about good or bad, better or worse, are usually about the semantics: how the compiler handles expressions,
how the runtime handles resources, and things like that.
When you bring libraries in, that's a different story. Python has a huge set of libs to work with, I'll give you that.

I would argue that the number of lines of code equals the number of potential errors ... Given equal skill in both, the one with more LOC is going to have a higher risk of errors.

Hmmm, ok, that's a good point.
I used to ask myself why my professors taught such an old and outdated language as C++ instead of something more current. Have you ever tried building an app with C++? I have. I quickly abandoned it for Kotlin.

How about a game with graphics? Yeah, it's possible, but pygame is infinitely easier. And better.
Plus, things like recognizing a button click, accepting arrow-key input, sorting... a beginner can't figure that stuff out in C++.

C#, Java, Kotlin (your heavy object languages) were built from the ground up to handle issues like these. All you have to know are the right keywords.

But now, I think I understand why they teach C++. It's so we actually know how to code. We have a deeper understanding of how a computer operates with on and off switches. Using list.sort() in Python, while dreadfully simple and powerful, teaches you nothing. You're pretty much just using code that someone else wrote and that you don't understand. I refuse to believe anyone but the developers understands all the built-in methods, objects, and classes in Java, C#, Kotlin, etc.

If you start off learning to use someone else's code, what are you supposed to do when it fails you? You don't know how to program; you know how to use wizards and frameworks. I think everyone should start with C or C++. I'm by no means a master of C++ (y'all have seen my questions), but thanks to C++ I understand programming well enough that I can get to this level with every language I've used within pretty much a day.

So, my point is, I think it's important for understanding computer logic and learning to code. After that, why not take advantage of the built in functions of other languages? They're so easy.
an old and outdated language like c++

*facepalm*

C++ may be old, but outdated? Not at all.
C#, Java, Kotlin (your heavy object languages) were built from the ground up to handle issues like these. All you have to know are the right keywords.

I was trying to do a LOT of small MD5 and CRC32 hashes of some data in real time (about 50 million of them per lot, 20 or so bytes each). This was to move some legacy code to a new platform; I know those are dated algorithms.
C++: baseline time.
Java: about 2.5 times slower.
Python: about 10 times slower.
JavaScript: about 2 times slower.

Problems: no unsigned type, which required extra code to force-fit the bytes. Having to have objects where none were needed (Java). Poorly written libraries (all three) that I had to rewrite (else the speed was even WORSE); also, using hand-cranked code where a library is available is considered poor form in the slow languages. General clunk in the languages, like unnecessary try/catch blocks to handle errors that could not be thrown by my data. Clunky support for the static-variable-in-a-function concept, if it existed at all. Typing problems, where the type of my variable would change and then be handled incorrectly in the dynamically typed languages.

The new shiny isn't always so shiny. I've had positive experiences in those languages too, but they fail dramatically any time you want even half-decent performance, let alone full real-time, tweaked results. I didn't even thread the C++ (or the others, though I believe the JavaScript threaded itself for me).

Speed isn't everything. But I can't trust a slow language, because invariably, if I become invested in it, someone will come up with a real-time requirement that can't be met, and I would have to change languages, starting the code base over from scratch.
>> Furry Guy

That was merely my mindset 2 semesters ago.
You've been badly MIS-educated, then.

Something new doesn't by necessity mean better. Might as well junk the internet since much of it is based on C and *nix.

AND IS OLD!
2 semesters ago was the start of my education. As in, the idea I had going in.
Education in C++ has changed my mind.
You know the entirety of what modern C++ has to offer? After 2 semesters?

I sure don't, and I've been teaching myself since before 2004.

"Outdated" is very subjective. As in opinion.

Likely your instructor is teaching the subject badly. VERY badly.

I doubt they are teaching anything of substance from C++11, let alone C++14 or C++17.

The syllabus is probably outdated, likely C++03 or even C++98, with very strong roots in C. Because that is "the way it has always been done."

C++ has changed over the years, C++11 changed the language quite a lot. C++14 made some changes along with needed fixes. C++17 made enormous refinements. C++ is going to have more changes and additions with the official release of C++20.

Is C++ perfect? Hardly.

But then no computer language is.

If you are so "down" on C++ why hang out here at CPlusPlus?
You're still misunderstanding, and I'm unsure how else to say it.

Before starting school, I had no experience in programming (unless you consider HTML programming). Me thinking C++ was useless was something I believed based on what I had heard.

Then I started school.

I went in with that belief.

Then C++ Rocky 4'd my opinion of it through education and practice with it.

(as in, my mind was changed. I believed one way. Change happened. I now believe another.)

I really have no idea where you gathered that I know everything about C++. I implicitly said the opposite a few posts up.

There are actually 2 reasons I hang out here. One is that what you think I think about C++ isn't what I actually think, and the second is homework help.