while (time.data() && m < (time.size() - 1))
{
    if (time[m] >= x && time[m] != time[m + 1])
    {
        x0 = time[m];
        x1 = time[m + 1];
        y0 = pos[0][m];
        y1 = pos[0][m + 1];
        y = y0 + (x - x0) * ((y1 - y0) / (x1 - x0));   // linear interpolation at x
        itime.push_back(x);
        ipos[0].push_back(y);
    }
    x = x + dx;   // x advances every iteration...
    m++;          // ...and so does m, whether or not a sample was emitted
}
My problem is that dx = 0.04 while the timestep of the data I am interpolating is ~0.05, which means that m reaches the end of the data (where time[m] is ~495 seconds) while x has only reached ~335 seconds, so the rest of the data never gets interpolated.
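For illustration, here is a minimal sketch of one way to decouple the two counters: advance m only until time[m] and time[m + 1] bracket x, and advance x only after each output sample. This is not the original code; the function name resample, the flat pos0/ipos0 vectors, and the assumption that time is sorted ascending are all illustrative.

#include <cstddef>
#include <vector>

// Sketch only: resample pos0(time) onto a uniform grid of spacing dx using
// linear interpolation. Assumes `time` is sorted ascending and `pos0` has the
// same length as `time`.
void resample(const std::vector<double>& time,
              const std::vector<double>& pos0,
              double dx,
              std::vector<double>& itime,
              std::vector<double>& ipos0)
{
    if (time.size() < 2)
        return;

    std::size_t m = 0;
    double x = time.front();              // first output sample at the first timestamp

    while (m < time.size() - 1)
    {
        if (time[m + 1] < x)              // this segment lies entirely behind x:
        {
            ++m;                          //   advance m, but not x
            continue;
        }

        if (time[m] != time[m + 1])       // skip zero-length (duplicate) segments
        {
            double x0 = time[m],  x1 = time[m + 1];
            double y0 = pos0[m],  y1 = pos0[m + 1];
            itime.push_back(x);
            ipos0.push_back(y0 + (x - x0) * (y1 - y0) / (x1 - x0));
        }

        x += dx;                          // advance x only after handling this sample
    }
}

Because m no longer advances on iterations that emit a sample, x keeps pace with the data and the output grid covers the full ~495 second span instead of stopping early.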