People still use Fortran because there's a metric ton of code already written in it, and it's more cost-effective to maintain that code than to rewrite it. Fortran compilers have also had substantially more time to develop robust optimizations. For *pure* number crunching, you can't really beat Fortran (except maybe straight-up ASM, but Fortran is far more portable).
Time isn't necessarily continuous. Just because the graph would be continuously increasing doesn't mean it's not finite (the Big Bang was a clear starting point; we've had a topic on the possible ending points too).
Fortran was the first high-level compiled language (I think).
It's an abbreviation of "Formula Translation".
The compiler translated high-level formulas into low-level code.
Back in the day, this was a revolutionary concept.
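For anyone curious what that actually looks like, here's a minimal sketch (the program name and values are just illustrative) of the kind of formula the compiler translates into machine code, written almost exactly as you'd write it on paper:

```fortran
! A toy example of "formula translation": the quadratic formula,
! expressed as a high-level formula the compiler lowers to machine code.
program formula_translation
  implicit none
  real :: a, b, c, root1, root2

  a = 1.0
  b = -3.0
  c = 2.0

  root1 = (-b + sqrt(b**2 - 4.0*a*c)) / (2.0*a)
  root2 = (-b - sqrt(b**2 - 4.0*a*c)) / (2.0*a)

  print *, "roots:", root1, root2
end program formula_translation
```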
Now you just need to actually define the function (there's not just one function of time :O) and define the 0 point (an epoch: the death of Louis XIV, the last time you ate a cheeseburger?).
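As a toy sketch (purely illustrative names and numbers), picking "the moment the object was dropped" as the zero point and writing position as a function of time could look like:

```fortran
! A toy "function of time": distance fallen by a dropped object,
! with t = 0 chosen as the moment it was released (our arbitrary epoch).
program function_of_time
  implicit none
  real, parameter :: g = 9.81   ! m/s^2
  real :: t, fallen

  t = 2.0                       ! seconds since our chosen zero point
  fallen = 0.5 * g * t**2       ! distance fallen after t seconds
  print *, "fallen", fallen, "metres since the epoch"
end program function_of_time
```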
They say time is a human creation; no other species recognizes time, just day and night and the four seasons. Does this mean time starts and ends with human life? Or does it just cease to be recognized?