I am using Magick++ to produce frames for a game engine. I invested time in switching from GDI expecting a performance increase, but I got the opposite: not only is it much slower, it also uses more than ten times the memory (60 MB vs 800 MB).
This is an engine for running Sierra AGI games from 30 years ago; something doesn't seem right here!
I figured that the culprit was probably the way I used the library, not the library itself.
I am going to show the important parts of my code to the experts here and hope for some tips to help me improve.
Note that I am an experienced C# dev but only a beginner C++ dev.
Also, I would love to remove the starter image and just generate one with a fill colour, but I don't know how to specify an image format if I do so.
I am compiling a C++/CLI project in mixed mode.
Wrapper.h
namespace CLIScumm {
public ref class Wrapper {
...
private:
...
Magick::Blob* magicKBlob;
};
ImageMagick is designed for processing things like photos.
So what it's going to be doing is storing your images in a high-precision format, perhaps floating-point RGBA at 16 or more bytes per pixel depending on how the library was built.
Very good for maintaining image fidelity in a photo, but totally unsuited to real-time frame rates from computer-generated imagery.