Hi guys, this is my first post. I'm studying the .dat files where Adobe stores resources in an XFL open project. Searching the web I found some interesting explanations.
A 1x1 px PNG with a red background is stored in the following way:
0305 ;raw bitmap identifier?
0400 ;length of decompressed row data
0100 ;width
0100 ;height
0000 0000 ;unknown
1400 0000 ;width in twips
0000 0000 ;unknown
1400 0000 ;height in twips
00 ;transparency flag (00 = no transparency, 01 = transparency)
01 ;compressed data flag (00 = uncompressed, 01 = compressed)
0200 ;length of compressed chunk
7801 ;compressed chunk
0A00 ;length of compressed chunk
FBFF 9F81 0100 06FD 01FF ;compressed chunk
0000 ;end of compressed stream
Someone said that this is zlib compression, but I do not understand whether the whole byte stream or only part of it is compressed with that library. For example, the width is 1 px and should be 0001, but it is 0100. Why?
Does anyone know how I can convert this file back to its original PNG format, or have more information about this kind of file? I need to do that in C++. Thank you!
Thank you vin for your reply.
I read something about endianness after your reply, and since it is 0100 instead of 0001, the "little endian" method is used. Correct?
Does this mean that all the bytes are swapped?
For example:
is the first value 0503?
And in the case of the width in twips?
Is it 0000 0014 or 0014 0000? Is this a DWORD?
I am just starting to study bytes, but all of this is very interesting.
Endianness is only of concern for multi-byte values, not single bytes.
Re the width and height above.
They are written on disk as 01 00, i.e. assuming a two-byte field.
In little endian the least significant byte is written first, so to interpret the value you read it as 00 01, which has decimal value 1 if you convert the hex to decimal.
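For example, with Python's struct module (which I also use further down) you can see both interpretations of those two bytes:
import struct # for evaluating binary data
raw = bytes([0x01, 0x00]) # the two bytes exactly as written on disk
value, = struct.unpack("<H", raw) # "<H" = little-endian unsigned 16-bit integer
print(value) # 1
value, = struct.unpack(">H", raw) # ">H" = big-endian, the wrong interpretation here
print(value) # 256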
Thanks to your suggestions I have converted the file and discovered some more information:
0503
0004 4 ?
0001 1 width in px
0001 1 height in px
0000 0000
0000 0014 20 it is just the width in px x 20
0000 0000
0000 0014 20 it is just the height in px x 20
00 0 boolean transparency
01 1 boolean compression
0002 2 there is only one byte so 2 hex digits
0178 376 it looks like always the same
000A 10 there are 5 bytes so 10 hex digits
FFFB 819F 0001 FD06 FF01
0000
The other two dimensions are not twip values, because 1 px = 15 tw, so they are just the pixel values * 20.
0178 is always the same in all the images that I created, so it could be a compression code or something like that.
Now I have to find an algorithm to convert the compressed bytes into pixels.
Any ideas?
0503
0004 4 ?
0001 1 width in px
0001 1 height in px
0000 0000
0000 0014 20 it is just width dimension x 20
0000 0000
0000 0014 20 it is just height dimension x 20
00 0 boolean transparency
01 1 boolean compression
0002 2 // there are two bytes following
78 01 // zlib header
000A 10 // there are 10 bytes following
FB FF 9F 81 01 00 06 FD 01 FF // zlib compressed pixel data
0000
Python is probably the easiest way to investigate this format; then write your program in C++.
I tried to zlib decompress just the compressed data but wasn't successful. Got a zlib error.
import struct # for evaluating binary data
import zlib # to use zlib
f = open("c:/xfl.dat","rb") # open the binary file for reading
f.read(2) # skip the 03 05 identifier
f.read(2) # skip the decompressed row length
width, = struct.unpack("<H",f.read(2)) # read two bytes as a little-endian unsigned two byte int
height, = struct.unpack("<H",f.read(2)) # read two bytes as a little-endian unsigned two byte int
.... # read the rest of the header and the first compressed chunk
length2, = struct.unpack("<H",f.read(2)) # read length of second compressed chunk
compressed_data = f.read(length2) # read compressed data
uncompressed_data = zlib.decompress(compressed_data) # failed!
f.close()
Edit:
OK, I figured it out.
The two-byte compressed chunk is the header needed by zlib.
So the data to decompress is 78 01 FB FF 9F 81 01 00 06 FD 01 FF. (Note: the data is not byte-swapped on disk because it is not a multi-byte value but a sequence of single bytes.)
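For example (just a quick check; reading the four result bytes as alpha followed by R, G, B is my guess at the layout):
import zlib # to use zlib
header = bytes([0x78, 0x01]) # first chunk: the two-byte zlib header
data = bytes([0xFB, 0xFF, 0x9F, 0x81, 0x01, 0x00, 0x06, 0xFD, 0x01, 0xFF]) # second chunk
pixels = zlib.decompress(header + data) # decompress the whole stream at once
print(pixels.hex()) # ffff0000 for the 1x1 red example - alpha FF, then red FF, green 00, blue 00
The last four bytes of the second chunk (06 FD 01 FF) are the Adler-32 checksum that zlib verifies at the end of the stream.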
Thank you very much for your help. I will start with Python to understand whether my project can be developed with this technology. Congratulations on this forum and in particular on your knowledge; in only one day I got all the information I needed to get started.