It's nice you stopped trolling and got back to actual arguments:
What does work is a multi-language approach: use functional languages or dynamic languages the way one does SQL. (Well, not *exactly* the way one uses SQL, but as a domain-specific language that helps get specific tasks done efficiently.) |
Scala supports exactly this kind of thing. It is many languages at once: you can have an SQL-like dialect in it, yet still code imperatively whenever you wish.
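Here's a minimal sketch of what I mean, with toy in-memory data and made-up names (no real database binding involved): the for-comprehension reads almost like a query, while right next to it the same job is done with a plain imperative loop.

    // The same task written declaratively (SQL-like) and imperatively.
    object QueryStyles extends App {
      case class User(name: String, age: Int)
      val users = List(User("alice", 31), User("bob", 17), User("carol", 45))

      // Declarative: roughly "SELECT name FROM users WHERE age >= 18 ORDER BY name"
      val declarative = (for (u <- users if u.age >= 18) yield u.name).sorted

      // Imperative: explicit loop filling a mutable buffer
      val buf = scala.collection.mutable.ListBuffer[String]()
      for (u <- users) if (u.age >= 18) buf += u.name
      val imperative = buf.toList.sorted

      println(declarative)  // List(alice, carol)
      println(imperative)   // List(alice, carol)
    }

Real SQL-flavoured DSLs in Scala are built out of essentially these ingredients (for-comprehensions, operator overloading, implicits); the point is that both styles live in one language.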
BTW, Scala is good for domain-specific solutions, but it doesn't adequately address the issue of multiple inheritance out of the box
http://www.qwantz.com/index.php?comic=1786 |
I disagree. Scala's multiple inheritance, done with traits and class linearization, is the answer to the problems of virtual inheritance in C++. Python took a similar approach with its method resolution order. Scala's designers learned from the failures of others.
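To make that concrete, here is a minimal sketch with hypothetical Logger traits (nothing from a real library). Both mixins extend the same base, which is exactly the diamond that needs virtual inheritance in C++, yet Scala's class linearization gives every super.log call a single, well-defined target:

    // Two traits extend the same Logger base (a "diamond"). Linearization
    // (Service -> ShortLogger -> TimestampLogger -> Logger) makes the super
    // calls chain deterministically: no duplicated base, no ambiguity.
    object Mixins extends App {
      trait Logger {
        def log(msg: String): Unit = println(msg)
      }
      trait TimestampLogger extends Logger {
        abstract override def log(msg: String): Unit =
          super.log(s"[${System.currentTimeMillis}] $msg")
      }
      trait ShortLogger extends Logger {
        abstract override def log(msg: String): Unit =
          super.log(if (msg.length <= 20) msg else msg.take(20) + "...")
      }

      class Service extends Logger with TimestampLogger with ShortLogger {
        def run(): Unit = log("starting a rather long-winded operation")
      }

      new Service().run()   // prints: [<millis>] starting a rather lo...
    }

In C++ you would have to remember to inherit virtually and would still resolve ambiguities by hand; here the compiler derives the order for you.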
Because it's hard to predict exactly how long an operation (particularly a complex one) will take without knowing implementation details? |
The same can be said about C++. This is a problem of every language higher-level than assembly, but in C++ it is even worse, because sometimes you cannot be sure even when you know the implementation: will the compiler apply RVO and eliminate the copy, or not?
Functional languages have been around for much longer than imperative languages, and yet they are but mere curiosities in the language world. Very few significant programs are written in LISP and it has been around much longer than C++. Why do you suppose that is? |
The question relies on two false premises:
1. That functional languages have been around for much longer. Not true: the first computers were programmed in assembly and FORTRAN; LISP came only afterwards.
2. That functional languages are not used in significant projects. This might be true for pure, academic functional languages like Haskell, but there are plenty of functional-<something> hybrids: Ruby, OCaml, F#, Python, Scala, Erlang and R, to name a few. The number of important commercial projects done in these languages is high (in fact, Python's and Ruby's popularity in open-source projects is about to catch up with C++'s [1]). In some of them these languages are used for mission-critical code, e.g. Erlang in the Open Telecom Platform (OTP) or Scala at Twitter (yes, they chose Scala and not C++, despite performance being #1 on their requirements list). NASA is also considering Scala or Python for its avionics software [2].
Many of the concepts researched in universities are first developed in some academic functional language; then students take them, write C or C++ programs based on those algorithms (often as homework), and bring them to market. Doing research and prototyping in a language like C or C++ would be a waste of costly (human) resources.
They are perfect for "here's a one-liner in <functional language of choice>". Look how elegant that is. The response should be: here's a 100,000-line mission-critical C++ or Java program, with another 100,000 lines of unit tests. How would you do that in <functional language of choice>?
The truth is you cannot. |
This one is easy. It is enough to show just one counterexample. I'll show a few:
1. AutoCAD (LISP)
2. Emacs (LISP)
3. Open Telecom Platform (Erlang)
4. Yahoo Store (LISP)
5. LinkedIn (Scala + Java)
6. Twitter (Scala + Ruby)
7. CouchDB (Erlang)
The truth: the functional model is great for programming "in the small" |
It is exactly the opposite. How many 1M+ LOC systems have you built to support such a (stupid) claim?
The fact is that although you can build large systems in imperative, low-level languages like C++, such systems break easily. Show me a large system with 99.9999999% availability (nine nines, i.e. about 30 milliseconds of downtime per year) written in an imperative language with manual memory management. There is none. And Erlang can do that. C and C++ are good for writing small embedded software, or for things where one crash per week is not a real problem: computer games, desktop applications and the like.
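For the curious, that nine-nines figure is easy to check; a quick back-of-the-envelope sketch (assuming a 365-day year):

    // Downtime implied by 99.9999999% ("nine nines") availability.
    object NineNines extends App {
      val secondsPerYear  = 365L * 24 * 3600               // 31,536,000 s
      val downtimeSeconds = (1.0 - 0.999999999) * secondsPerYear
      println(f"$downtimeSeconds%.3f s/year")              // ~0.032 s, i.e. about 30 ms
    }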