Hey everyone!
I normally don't cross-post, but I would really like to see this issue fixed and I feel like my StackOverflow question isn't really getting anywhere. Here's my problem:
In my current workflow I export a 3D model along with various information (e.g. vertex colors and UV coordinates) from Blender via Python. This is stored in a binary file, which is then read in C++.
Consider the following code:
Python:
import struct

ar = [0, 1, 2, 3, 4]
with open('model.bin', 'wb') as file:
    file.write(struct.pack('%si' % len(ar), *ar))
C++:
u32* ar = (u32*) model_bin; // model_bin is already declared externally
for (int i = 0; i < 5; i++) {
    printf("%d\n", ar[i]);
}
This works! However, my file contains more than just one list of data; it holds a lot of data of different types. I keep track of this with a separate pointer at the start of each new "element", i.e.:
rgb* vcols = (rgb*) model_bin;
u32* uvco = (u32*) ((char*) vcols + 2*numVcols);
Here is the full "stack":
http://www.pasteall.org/51026/c
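In general, each section pointer is just the previous pointer plus the previous section's size in bytes. For anyone who doesn't want to click through, here is a stripped-down, self-contained version of that pattern; the type names, counts and section sizes below are only placeholders, the real layout is in the paste:

// Stripped-down illustration of the pointer-chain pattern.
// rgb, u32 and the counts are placeholders here; the real layout
// and sizes are in the pasteall link above.
#include <cstdio>
#include <cstdint>

typedef uint32_t u32;
struct rgb { uint8_t r, g, b; };   // placeholder vertex-color element

int main() {
    // Stand-in for the externally loaded model_bin buffer.
    uint8_t model_bin[256] = {0};

    const size_t numVcols = 4;     // placeholder counts
    const size_t numUvco  = 6;

    // Section 1: vertex colors start at the beginning of the buffer.
    rgb* vcols = (rgb*) model_bin;

    // Section 2: UV data starts right after the vertex-color block;
    // the byte offset is computed through a char* cast.
    u32* uvco = (u32*) ((char*) vcols + numVcols * sizeof(rgb));

    // Any further section follows the same pattern:
    // next = (type*) ((char*) prev + prev_section_size_in_bytes);

    printf("reading %zu u32 values starting at byte offset %td\n",
           numUvco, (char*) uvco - (char*) model_bin);
    return 0;
}

The idea is simply that the offsets on the C++ side line up exactly with the order and sizes of what struct.pack wrote.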
Right now, everything checks out, except for the first value of my UV coordinates: setting it to 0 in my export script makes my C++ application read it back as -566558720.
If I export and read the UV coordinates as shorts instead, however, everything works fine.
But it gets better! When I set my last vcol[] element to 0, the first uvco int reads back just fine. Setting that final vcol to 1 makes the first uvco read out as 65578.
I realize this is very tricky to debug without seeing the actual code and such, but perhaps someone can lend me some insights or suggestions based on this information?
Thanks,
Patrick