deque initialization fails sometimes

Hi!
The following program works correctly and without problems for some input values; for others it terminates prematurely, and for still others it crashes with an error message.

The problem seems to be the initialization of the deque at the beginning of corr_dim() in line 108. The values of r (0 < r < 5) and x0 (0 < x0 < 1) also influence whether it crashes.

As examples, the code
a) terminates without an error message after the output "No problem up to here." for r=3.9, x0=0.4, n=500, skip=4, buffersize=400, exclusionsize=40;

b) works for r=4, x0=0.4, n=500, skip=4, buffersize=400, exclusionsize=40;

c) crashes at runtime with an exception (in ntdll.dll) after the output "No problem up to here." using the same values except n=600;

d) terminates without an error message after the output "No problem up to here." using the same values except n=900;

e) works for n=2,000,000, skip=500, buffersize=32767, exclusionsize=767 (these are realistic values for the program's intended use; the values under a)-d) are just for testing).


This occurs in both the debug and release builds; I'm using gcc (32-bit) on WinXP 64-bit. I'm baffled and would be grateful for an explanation and a solution!

#include "math.h"
#include "windows.h"
#include <iostream>
#include <string>
#include <fstream>
#include <cstdlib> // for NULL
#include <deque>
#include "convert.h"
using namespace std;

double* LogisticMap(double r,double x0, unsigned long n);
void corr_dim(double* data, int buffersize, int exclusionsize);

unsigned long n;
unsigned int skip, buffersize, exclusionsize;
double total[61] = {0};

int main()
{
	double r, x0;
	double* logisticdata;
	SYSTEMTIME st, et; //variables holding start and end time of calculation
	ifstream input_stream;
	ofstream output_stream;
	string ifilename( "inp.txt");
	string ofilename( "out.txt");

	cout<<"Enter parameter r: ";
	cin>>r;
	cout<<"Enter starting value x0: ";
	cin>>x0;
	cout<<"\nEnter number of datapoints to generate (eg. \"1000000\"): ";
	cin>>n;
	cout<<"Enter number of datapoints to skip initially: ";
	cin>>skip; //to be sure that we are on the attractor
	cout<<"Enter size of circular buffer: ";
	cin>>buffersize;
	cout<<"Enter number of points to exclude: ";
	cin>>exclusionsize; // excluding the most recent exclusionsize points
	cout<<"\n\n";

	ofilename = "out r = " + stringify(r) + ", n = " + stringify(n)+ ".txt";
	cout<<"Results will be saved in file " + ofilename + ".\n\n";

	cout<<"Iterating logistic map... ";
	logisticdata = LogisticMap(r, x0, n);
	cout<<"done.\n";

	cout<<"Starting to compute and bin distances...\n\n";
	GetLocalTime(&st); //get time before start of binning
	cout<<st.wHour<<":"<<st.wMinute<<":"<<st.wSecond<<"\n";
	/* call the routine that computes and bins distances skipping the specified
number of data points to be sure to be on the attractor */
	corr_dim( &logisticdata[skip-1], buffersize, exclusionsize);

	GetLocalTime(&et); //get end time
	cout<<"\nCalculation complete: "<<et.wHour<<":"<<et.wMinute<<":"<<et.wSecond<<"\n";

	output_stream.open(ofilename.c_str());
	output_stream<<"Estimate the correlation dimension of the logistic map.\n";
	output_stream<<"Written by Leander Hohmann, May 2009.\n\n";
	output_stream <<"Calculation started at: "<<st.wHour<<":"<<st.wMinute<<":"<<st.wSecond<<"\n";
	output_stream <<"Calculation completed at: "<<et.wHour<<":"<<et.wMinute<<":"<<et.wSecond<<"\n";
	output_stream <<"Parameters were:\n";
	output_stream <<"r = "<<r<<"\n";
	output_stream <<"x0 = "<<x0<<"\n";
	output_stream <<"n = "<<n<<"\n";
	output_stream <<"skip = "<<skip<<"\n";
	output_stream <<"buffer size: "<<buffersize<<"\n";
	output_stream <<"exclusion size: "<<exclusionsize<<"\n";
	output_stream <<"________________________\n";
	output_stream<<"\ntotal result of binning:\n";

	for(int i=0; i < 61; i++){
		output_stream << total[i] <<" ";
	};

	output_stream.close();
	return 0;
}

//takes param r, start value x0 and n and iterates the logistic map n times
double* LogisticMap(double r,double x0, unsigned long n)
{
	double* list = NULL;
	list = new double[n];
	list[0] = x0;
	unsigned long i=0;

	for(i = 0; i < n; i++){
		*(list + i + 1) = r * *(list+i) * (1 - *(list+i));
	};
	return(list);
}

/*
takes a set of values and computes and bins the distances between data points
in a circular buffer of size buffersize, ignoring the recent exclusionsize values.
*/
void corr_dim(double* data, int buffersize, int exclusionsize)
{
	double current, distance = 0;
	int it_buffer = 0, it_bins = 0;
	long it_data = 0;
	cout<<"No problem up to here.";
	deque<double> buffer;
	cout<<"But this does not work SOMETIMES.";

	const double bins[61] = {0,3.8242466280971355e-1,0.0015034391929775724,
............,
		0.0024787521766663585};

	//fill the buffer initially
	for(it_buffer = 0; it_buffer < buffersize; it_buffer++){
		buffer.push_back(data[it_buffer]) ;
	};

	//iterate over all given points, starting with the first point not in the buffer
	for(it_data = buffersize+1; it_data <= n - skip ; it_data++){
		current = buffer.back(); //set last value in buffer as current
		//compute distance to all values in buffer up to the excluded values
		for(it_buffer = 0; it_buffer <= buffersize - exclusionsize; it_buffer++){
			distance = pow(buffer[it_buffer] - current, 2);
			//increment count of all bins having larger upper boundary
			it_bins=60;
			while(distance < bins[it_bins]){
				total[it_bins]++;
				it_bins--;
			};
		};
		buffer.pop_front(); //delete oldest value in the buffer
		buffer.push_back(data[it_data]);//append next data point at buffer's end
	};

}
Why use a std::deque in one place and dynamically allocate a double array in another? Why not use std::deque or std::vector throughout and be consistent?

Also, your program doesn't have basic safety checks in place to ensure that the values used to access the containers are valid. With all of the calculations that you are doing I bet that at some point you are writing or reading outside the bounds of one of the arrays.

The corr_dim function doesn't make any sense. The buffer size has nothing to do with the size of the array being passed in, and the corr_dim function has no idea how big the logisticdata array is. If your program exhibits undefined behavior, there won't seem to be any rational reason for where or why it crashes. You need to use the debugger to analyze your array sizes and monitor the operations where you are reading from and writing to the arrays. Line 108 is benign in and of itself: it simply creates an empty deque. I'm not sure why you think that the deque isn't being initialized correctly.
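
To make that concrete, here is a minimal sketch (not the poster's code; the function name and parameters are only for illustration) of how corr_dim could take a std::vector so it knows the size of the data it is given, and how at() can be used for bounds-checked access:

#include <cstddef>
#include <deque>
#include <vector>

// Sketch only: passing the data as a std::vector lets the function query
// data.size() itself instead of trusting separate bookkeeping, and at()
// throws std::out_of_range instead of silently reading past the end.
void corr_dim_checked(const std::vector<double>& data,
                      std::size_t buffersize, std::size_t exclusionsize)
{
	std::deque<double> buffer;

	// fill the circular buffer, but never read beyond the data we actually have
	for (std::size_t i = 0; i < buffersize && i < data.size(); ++i)
		buffer.push_back(data.at(i));

	// ... the distance/binning loops would follow here, with every index
	//     compared against buffer.size() and data.size() before use ...
	(void)exclusionsize; // unused in this fragment
}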
Thanks for the answer!

Why use a std::deque in one place and dynamically allocate a double array in another? Why not use std::deque or std::vector throughout and be consistent?


You are right that I'm being inconsistent, and I apologize for the program's bad style. Which double array do you mean, by the way?

However:

Also, your program doesn't have basic safety checks in place to ensure that the values used to access the containers are valid. With all of the calculations that you are doing I bet that at some point you are writing or reading outside the bounds of one of the arrays.


That could be, but I haven't observed it; for sensible values the program runs and appears to produce correct results, without crashing. What I don't understand is that, depending on the values one enters, the program seems to crash or quit prematurely at line 108 without ever running any of the later calculations.

The corr_dim function doesn't make any sense. The buffer size has nothing to do with the size of the array being passed in, and the corr_dim function has no idea how big the logisticdata array is. If your program exhibits undefined behavior, there won't seem to be any rational reason for where or why it crashes. You need to use the debugger to analyze your array sizes and monitor the operations where you are reading from and writing to the arrays. Line 108 is benign in and of itself: it simply creates an empty deque. I'm not sure why you think that the deque isn't being initialized correctly.


I'm sorry, but you got that wrong: yes, the buffer size has nothing to do with the amount of data (it's a circular buffer used to evaluate only part of logisticdata at a time), and corr_dim seems to work correctly. As said before, the problem doesn't seem to lie with the array accesses but with something else, and I suspect it is something before line 108.
I haven't answered simply because I haven't had the time or inclination to analyze your 137 lines of code...

But kempofighter has it exactly right: you are crashing because you are trying to access out-of-bounds data.

Don't bother saying "the STL doesn't work" -- it is always a mistake to blame the tool rather than the user of the tool, especially when the tool is an industry standard used by a lot of people and big-name companies all over the world. If it were so blatantly broken, it wouldn't be so widely used.

Also, just because the debugger stopped you at a specific line does not mean that line is the location of the error -- it only means that is where the program was executing when the CPU/exception software managed to report the error.

So I'm sorry, but you got it wrong. Consider adding some bounds checking to the mess you've written and you'll eliminate your error. There are libraries you can use to add runtime bounds checking, and you can also enable bounds checking with GCC if you are using that compiler.
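
For example (a sketch of one option, assuming your GCC's libstdc++ supports its debug mode; the file names below are placeholders): compiling with the _GLIBCXX_DEBUG macro turns element access and iterator use on std::vector and std::deque into checked operations that abort with a diagnostic instead of silently corrupting memory:

g++ -g -D_GLIBCXX_DEBUG main.cpp -o test.exe

Alternatively, using the checked accessor at() instead of operator[] on the containers gives you a thrown std::out_of_range for each individual bad access.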


I really don't want to be rude -- but sloppy code leads to exactly the kinds of errors you are having.
It seems to me that my mistake is an out-of-bounds access in line 91. I think that's the only one.
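
For reference, that is the list[i + 1] write: new double[n] owns the indices 0 .. n-1, but the last loop iteration (i = n-1) writes to list[n]. A minimal sketch of one way to fix it, keeping the raw array and still producing n points including x0, would be:

double* LogisticMap(double r, double x0, unsigned long n)
{
	double* list = new double[n];          // valid indices: 0 .. n-1 (assumes n >= 1)
	list[0] = x0;
	for (unsigned long i = 1; i < n; i++)  // start at 1 so list[i-1] and list[i] stay in range
		list[i] = r * list[i - 1] * (1 - list[i - 1]);
	return list;
}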

I did not want to imply that I think the STL does not work; I just thought I had traced my error to that point. As I learned from your answers, out-of-bounds errors cannot be traced the way I thought.

In any case, what would be a good way to clean up "the mess I've written"?