Conjugate gradient using MPI

Hello everyone.
I wrote a C++ MPI code for solving the equation Ax = b with the conjugate gradient method, and I also solved the same system with a serial C++ code. The results match in every iteration for both the parallel and the serial algorithm, but when I print the result from the MPI version I get nan(ind) for x, even though x is not NaN in that iteration. I am attaching my code and I would be grateful if you could take a look at it and let me know what the problem is.


	double** a = new double*[N_DIM];
	for (int i = 0; i < N_DIM; i++)
		a[i] = new double[N_DIM];


If you must use arrays (std::vectors would be much easier to work with), why dynamically allocate them? Why not:

double a[N_DIM][N_DIM];


I'm not familiar with the MPI_ code you are using. But you are using 'a' as a double* in line 143 (and probably elsewhere). An array of pointers to arrays is not the same memory layout as a 2-dimensional array. So, if your MPI_ library is expecting a to be a contiguous block of memory containing a 2-dimensional array of doubles, it will fail.
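
For what it's worth, a common way to keep a dynamically sized matrix in one contiguous block (so an MPI call can treat it as a single buffer) is sketched below. The size and the MPI_Bcast call are just for illustration, not taken from your attachment:

#include <mpi.h>
#include <vector>

int main(int argc, char** argv)
{
	MPI_Init(&argc, &argv);

	const int N_DIM = 8;   // illustrative size; your code defines its own N_DIM

	// One contiguous block of N_DIM * N_DIM doubles: element (i, j) is
	// a[i * N_DIM + j].  Because the storage is a single block, the whole
	// matrix can be handed to MPI as one buffer.
	std::vector<double> a(N_DIM * N_DIM, 0.0);

	// Example: make the matrix known on every rank.
	MPI_Bcast(a.data(), N_DIM * N_DIM, MPI_DOUBLE, 0, MPI_COMM_WORLD);

	MPI_Finalize();
}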
Thanks for your answer. I had already changed this before your reply, but unfortunately it doesn't make any difference.
If my_rank isn't zero, I don't see any code that would generate any output. Could that be the cause?
If my_rank isn't zero, I don't see any code that would generate any output.


If my_rank isn't zero then it wouldn't be a good idea to do any output: you aren't the root processor.
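
The usual pattern is to guard the printing so only the root rank ever writes the result. A minimal sketch (x here is just a stand-in for your solution vector):

#include <mpi.h>
#include <cstddef>
#include <iostream>
#include <vector>

int main(int argc, char** argv)
{
	MPI_Init(&argc, &argv);

	int my_rank = 0;
	MPI_Comm_rank(MPI_COMM_WORLD, &my_rank);

	// Stand-in result vector; in the real code this would be the solution x.
	std::vector<double> x(4, 1.0);

	// Only the root rank prints; every other rank stays silent.
	if (my_rank == 0)
	{
		for (std::size_t i = 0; i < x.size(); i++)
			std::cout << "x[" << i << "] = " << x[i] << '\n';
	}

	MPI_Finalize();
}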

@resabzr,
There is NOTHING in your main routine that currently needs to use the MPI system. Just get that bit running in serial first, rather than trying to run 278 lines of code. If the root processor can't print the array then you haven't a hope.


Anything meaningful in main() is done under the aegis of
if (myrank == 0)
In other words ... only the root processor knows anything about it.


In routine VxV() you have
if (myrank == 0)
	{
...
		if (myrank > 1)

Nothing, but nothing, is going to happen there.
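
If VxV() is meant to be a dot (vector-times-vector) product, the parallel version normally lets every rank work on its own slice and then combines the partial sums with MPI_Allreduce; no rank tests are needed inside the kernel at all. A rough sketch (parallelDot and the slice sizes are made up for illustration):

#include <mpi.h>
#include <cstddef>
#include <iostream>
#include <vector>

// Every rank computes the dot product of its own slice, then MPI_Allreduce
// sums the partial results so all ranks hold the global value.
double parallelDot(const std::vector<double>& u, const std::vector<double>& v)
{
	double local = 0.0;
	for (std::size_t i = 0; i < u.size(); i++)
		local += u[i] * v[i];

	double global = 0.0;
	MPI_Allreduce(&local, &global, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);
	return global;
}

int main(int argc, char** argv)
{
	MPI_Init(&argc, &argv);

	int rank = 0;
	MPI_Comm_rank(MPI_COMM_WORLD, &rank);

	// Each rank pretends to own a slice of two length-3 vectors.
	std::vector<double> u(3, 1.0), v(3, 2.0);
	double d = parallelDot(u, v);   // = 6 * number_of_ranks

	if (rank == 0)
		std::cout << "global dot product = " << d << '\n';

	MPI_Finalize();
}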


Start small ... and start IN SERIAL.
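
As a reference point, a plain serial conjugate gradient for a symmetric positive-definite system fits in a few dozen lines. Something like the sketch below (the function names and the tiny 2x2 test system are purely illustrative) is what I would get working and verified before adding any MPI calls:

#include <cmath>
#include <cstddef>
#include <iostream>
#include <vector>

// Dense row-major matrix-vector product y = A*x.
std::vector<double> matVec(const std::vector<std::vector<double>>& A,
                           const std::vector<double>& x)
{
	std::vector<double> y(A.size(), 0.0);
	for (std::size_t i = 0; i < A.size(); i++)
		for (std::size_t j = 0; j < x.size(); j++)
			y[i] += A[i][j] * x[j];
	return y;
}

double dot(const std::vector<double>& u, const std::vector<double>& v)
{
	double s = 0.0;
	for (std::size_t i = 0; i < u.size(); i++) s += u[i] * v[i];
	return s;
}

// Conjugate gradient for Ax = b, A symmetric positive definite.
std::vector<double> conjugateGradient(const std::vector<std::vector<double>>& A,
                                      const std::vector<double>& b,
                                      double tol = 1e-10, int maxIter = 1000)
{
	std::vector<double> x(b.size(), 0.0);   // initial guess x0 = 0
	std::vector<double> r = b;              // r0 = b - A*x0 = b
	std::vector<double> p = r;              // first search direction
	double rsold = dot(r, r);

	for (int k = 0; k < maxIter && std::sqrt(rsold) > tol; k++)
	{
		std::vector<double> Ap = matVec(A, p);
		double alpha = rsold / dot(p, Ap);
		for (std::size_t i = 0; i < x.size(); i++)
		{
			x[i] += alpha * p[i];
			r[i] -= alpha * Ap[i];
		}
		double rsnew = dot(r, r);
		for (std::size_t i = 0; i < p.size(); i++)
			p[i] = r[i] + (rsnew / rsold) * p[i];
		rsold = rsnew;
	}
	return x;
}

int main()
{
	// Tiny SPD test system: A = [[4,1],[1,3]], b = [1,2];
	// exact solution is (1/11, 7/11), roughly (0.0909, 0.6364).
	std::vector<std::vector<double>> A = {{4.0, 1.0}, {1.0, 3.0}};
	std::vector<double> b = {1.0, 2.0};

	std::vector<double> x = conjugateGradient(A, b);
	std::cout << "x = (" << x[0] << ", " << x[1] << ")\n";
}

Once that prints the expected answer, parallelising one kernel at a time (matrix-vector product, then the dot products) is far easier to debug.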

Topic archived. No new replies allowed.