that is certainly seconds and nanoseconds to micros, yes.
if it seems wrong, check the input units and make sure you aren't truncating anything with integer math.
today's machines can run a LOT in a microsecond. If you get all zeros, try a sleep for a specific time and see if you get the right answer.
what is the point of timespec?
a 64 bit int holds about 600 billion years worth of seconds, or roughly 600 million years worth of ms. If we continue to count from 1970, a wider integer could carry picoseconds or something and still have all we need until well past the future where 256 bit integers are the norm.
I seem to have overlooked the type, and looking now, I see no use for it at all.
I have seen other code that considered only the nanoseconds part, so I'm confused: should I consider both parts when measuring the running time, or only the nanoseconds part?
it depends on how long it runs. Does your program run for more than a second?
To put it in perspective, you can sort over a billion numbers** in a second on a reasonably up to date PC.
here is what I use for most functions. You can get a duration instead of just a difference, but the difference is good enough for me usually (my goal is usually to reduce the value; the units don't matter).
auto start = high_resolution_clock::now();
code to time();
auto end = high_resolution_clock::now();
cout << (end - start).count()<< "\n";
the duration looks like this (whatever units -- it supports a lot)
chrono::duration_cast<chrono::seconds>(end - start).count()
**(multi-threaded; single-threaded is more like 10M in a second)
It's not correct because it's the wrong type. 1e6 and 1e-3 are doubles.
Regarding timing something, I suggest holding the start/stop values, finding the difference, then converting to microseconds. If you're using integral types (variants of int), the order of those operations won't matter, but with doubles it will.
If you want to find the difference between two timespec values, why not code it?
it depends.
let's break it down...
say you have a function that takes about 10 seconds.
you can print that it took 10.123456 seconds.
you can print that it took 10123.456 ms.
whatever you want to see on the screen, in other words.
I don't know how the weird time type you have works, but if it is split into seconds and subseconds as two fields, then you need the seconds field too. Again, I would do away with all that, just use the high resolution clock as I showed you, cast it to the output units you want, and call it a day. I don't know whether it counts from 1970 or not, but if you subtract two of them to get a difference, it does not matter: all of that is cleared out and you are left with just the few seconds and subseconds of your run. The extra forty-something years 'cancels out'.
my understanding is that it is seconds + nanos, because it says "broken down into" and not "your choice of" or something like that. It's probably faster to play with it than to unravel the documentation and examples. Just throw together a little program with a sleep in it and see what you get when you sleep for 5 seconds.
Linux has nothing to do with which function you choose here. Everything we talked about is standard C++ as far as I saw; it's just choices.
#include <iostream>
#include <time.h>
#include <unistd.h>
#include <chrono>
using namespace std;
using namespace std::chrono;

/*******************************************************************************/
// Function Name: clock_gettime and diff
// Inputs:
// Output:
// Description: these two functions are responsible for calculating the execution time
/*******************************************************************************/
int clock_gettime(clockid_t clk_id, struct timespec *tp);

timespec diff(timespec start, timespec end)
{
    timespec temp;
    if ((end.tv_nsec - start.tv_nsec) < 0) {
        temp.tv_sec = end.tv_sec - start.tv_sec - 1;
        temp.tv_nsec = 1000000000 + end.tv_nsec - start.tv_nsec;
    } else {
        temp.tv_sec = end.tv_sec - start.tv_sec;
        temp.tv_nsec = end.tv_nsec - start.tv_nsec;
    }
    return temp;
}

int main() {
    timespec time1, time2, time3;
    clock_gettime(CLOCK_PROCESS_CPUTIME_ID, &time2);
    sleep(5);  // sleep for 5 seconds
    clock_gettime(CLOCK_PROCESS_CPUTIME_ID, &time3);
    time1 = diff(time2, time3);
    cout << "seconds part is " << time1.tv_sec << endl;
    cout << "nano-seconds part is " << time1.tv_nsec << endl;
    cout << "converting total to micro-seconds" << endl;
    // convert time to microseconds
    cout << (time1.tv_sec)*1e6 + (time1.tv_nsec)*1e-3 << endl;
    return 0;
}
The output is:
seconds part is 0
nano-seconds part is 36339
converting total to micro-seconds
36.339
As seen:
I sleep for 5 seconds, but the seconds part is 0
36 microseconds is not much like 5 seconds :)
I would dig into it until you get something that approximates 5,000,000 (5 seconds in microseconds).
did it seem to sleep for the 5 seconds you asked for?
I hate to keep saying this, but you are fighting yourself over something that should be a couple of lines of code, and for no reason I can see....
#include <iostream>
#include <chrono>
#include <thread>
using namespace std;
using namespace std::chrono;

int main()
{
    auto start = high_resolution_clock::now();
    std::this_thread::sleep_for(std::chrono::seconds(5));
    auto end = high_resolution_clock::now();
    cout << chrono::duration_cast<chrono::microseconds>(end - start).count();
    return 0;
}
output (varies per run a little)
5014873