What happened on the 1st of January, 1970? Who cares? What about 2000? Again, who cares? Neither of these epochs makes sense; they're both so arbitrary. Nothing of note happened on those days.
The most logical epoch is one that we have hitherto been unable to represent on most computers due to its size. But with the advent of 64-bit computers, which are fast superseding the older 32-bit processors, it's now possible to represent time with the most logical and least arbitrary epoch possible: the beginning of time itself.
Introducing Universal Epoch Time™.
With 64 bits, it is possible to store a number up to 2^64 - 1, which is exactly 18,446,744,073,709,551,615. Interpreting this as a count of seconds, it is therefore possible to represent up to 584,526,042,639 years. The age of the universe is estimated to be about 13.7 billion years, so using unsigned 64-bit integers we can represent the current age of the universe, plus about 570 billion years, which is far longer than the human race (let alone the universe) is expected to last.
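As a quick sanity check of that arithmetic, here is a minimal C sketch. The year length is an assumption: 365.26 days (31,558,464 seconds) is the value that reproduces the figure above, and other definitions of a year shift the result slightly.

#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    const uint64_t max_seconds = UINT64_MAX;         /* 2^64 - 1 seconds */
    const uint64_t seconds_per_year = 31558464ULL;   /* 365.26 days * 86,400 s (assumed) */

    /* Prints 584526042639, matching the figure quoted above. */
    printf("Representable years: %" PRIu64 "\n", max_seconds / seconds_per_year);
    return 0;
}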
Thus, unless humanity literally ascends to godhood, there will never be another Y2K or UNIX-time bug. (And if we have ascended to godhood, we can probably fix the bug quite easily.)
Conversions (in C for compatibility):
#include <assert.h>
#include <stdint.h>
#include <time.h>

typedef uint64_t uet_t;

/* Seconds from the UET epoch (the beginning of the universe) to the UNIX
 * epoch (1st of January, 1970). An enum constant cannot portably hold a
 * value this large, so a macro is used instead. */
#define UET_TO_UNIX 432000000946700000ULL

uet_t unix_to_uet(time_t unix_time)
{
    return (uet_t)unix_time + UET_TO_UNIX;
}

time_t uet_to_unix(uet_t uet)
{
    /* C defines no TIME_MAX; this assumes time_t is a signed 64-bit count
     * of seconds, as it is on most modern platforms. */
    static const uet_t max = (uet_t)INT64_MAX;
    const uet_t adjusted = uet - UET_TO_UNIX;
    assert(adjusted <= max);
    return (time_t)adjusted;
}
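And a minimal usage sketch, assuming the conversion functions above are in the same file and that time_t holds seconds since the UNIX epoch:

#include <inttypes.h>
#include <stdio.h>

int main(void)
{
    const time_t now = time(NULL);          /* seconds since the UNIX epoch */
    const uet_t now_uet = unix_to_uet(now); /* seconds since the beginning of time */

    printf("UNIX time: %jd\n", (intmax_t)now);
    printf("UET:       %" PRIu64 "\n", now_uet);
    return 0;
}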