Convert a 1D array index to 2D coordinates

This sounds like a problem for the General forum at first, but it isn't necessarily.

Anyway, you have an image of width X and height Y. The image is stored as a 1D array of pixels of size N. For the sake of argument, let's say X is 15 and Y is 9, so N is 135.

Given an index I into the array, I want to know its x and y coordinates. It should be noted that pixel (0,0) is index 0 and the last pixel, (14,8), is index 134, since both the coordinates and the indices are zero-based.

How on earth do I solve this? It's driving me nuts. I worked out that i = cX + (cY * width), i being the index, cX being its x coord, cY being the y coord, and width being the width of the image... but I'm stuck going from the index back to the coordinates.
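A quick sanity check of the direction I do have, in C++ (just a throwaway sketch; pixel (7,4) is an arbitrary example within the 15x9 image):

#include <iostream>

int main() {
    const int width = 15;       // X from the example above
    int cX = 7, cY = 4;         // an arbitrary pixel in the 15x9 image
    int i = cX + (cY * width);  // 7 + 4*15 = 67
    std::cout << "(" << cX << "," << cY << ") -> index " << i << "\n";
}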

EDIT:
It may help to know what I'm doing...

I'm writing a function that calculates the x and y coordinates of every pixel in a circle, given its radius and center.

The pseudocode was easy to write:

find the parametric point of every point on the perimeter of radius i
reduce i by 1
if i is greater than 0, repeat


The problem arose with parametric point processing.

The set of equations for finding the X,Y coordinates of a position on the perimeter is as follows:

X = cx + r cos(a), where cx is the x coord of the circle's center
Y = cy + r sin(a), where cy is the y coord of the circle's center

I need the center x and y to do this... and all I have is the index into the array.
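Assuming I already had the center, the perimeter step would look something like this (a rough sketch; the 0.01-radian angle step and the rounding to the nearest pixel are arbitrary choices, not requirements):

#include <cmath>
#include <vector>

struct Point { int x, y; };

const double PI = 3.14159265358979323846;

// Collect the pixels on the perimeter of a circle of radius r centered at (cx, cy).
std::vector<Point> perimeter(int cx, int cy, double r)
{
    std::vector<Point> pts;
    for (double a = 0.0; a < 2.0 * PI; a += 0.01)  // walk the angle in small steps
    {
        int x = (int)std::lround(cx + r * std::cos(a));
        int y = (int)std::lround(cy + r * std::sin(a));
        pts.push_back({x, y});
    }
    return pts;
}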


closed account (z05DSL3A)
i % width and i / width?
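That is, cX = i % width and cY = i / width. As a minimal sketch of that suggestion, using the 15-wide example from the question (index 67 is just the pixel (7,4) from earlier):

#include <iostream>

int main() {
    const int width = 15;  // X from the example above
    int i = 67;            // flat index of some pixel
    int cX = i % width;    // column: 67 % 15 = 7
    int cY = i / width;    // row:    67 / 15 = 4 (integer division)
    std::cout << "index " << i << " -> (" << cX << "," << cY << ")\n";
}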
Thank you, kind sir :D I was completely overcomplicating it... Thank you. Also, 3:40 AM + not sleeping for 2 days prior + studying for finals != an intuitive Serapahimsan