Static linking. GCC was statically linking the libraries that do memory allocation, I/O, etc., while the other compiler wasn't.
Don't worry, this only really hurts with small programs. For larger programs, the difference is negligible because the code being compiled is much larger.
I was compiling the helloworld program using TDM MinGW, but compiling the MPI application using g++ in a linux vm. Same compiler, different platform...I should have known the code size for the gnu compiler in linux would be smaller than in Windoze.
I recompiled the helloworld program using the VM's compiler, and it shrunk down to 9kB.
What a difference!
So I take it that MinGW links statically, while a 'pure' gcc links dynamically? Why the difference between the two? Can I tell MinGW to link dynamically?
Linux executables typically link dynamically with shared libs known to be on the system.
Windows executables typically link statically with libraries when it is not known that they will be on the system. You can do the same thing, though: tell MinGW to use dynamic libraries -- only you must make sure your users have all the DLLs it uses on their system before your program will work.
To link against DLLs, you need to use the -Bdynamic flag. It's a linker flag, so when driving the link through gcc/g++ you pass it as -Wl,-Bdynamic on the command line.
You can also use the shared libg++ by compiling all your modules with _GLIBCXX_DLL #defined and with the -lstdc++_s flag at the command line.
Thanks for the enlightenment on the subject of dynamic vs static libraries Duoas, much appreciated.
...Why? How does that make enough sense to be predictable?
Because MinGW has always been an afterthought of the GCC project. They don't even bother to package a Windows Installer Package anymore, unless you want to install the really old (and bad IMO) 3.4.5 version. Anyone looking for an easy way to get it working in Windows usually just downloads the TDM version, but it's poorly supported, making it a less attractive alternative to the official binaries. I'd rather install an Ubuntu VM and use GCC from a tty in that than try to fight with the half-dozen or so sets of tarballs to get it working properly in Windows.
Uh... The last time I had to download MinGW (4.4), I just downloaded one or two tarballs and had to run a batch, I believe. Wow, that was mind-blowingly hard.
It's also a one-time chore, since you can later just compress the directory and take it everywhere with you. For example, I have a 7z that can be extracted, and when the batch file is run, it decompresses an MSYS and MinGW installation and compiles ffmpeg from source.
Just this morning, I tried to install GCC 4.5. I followed the instructions to the letter, and when I tried to compile a simple helloworld program, it complained about not being able to find standard library headers. Thinking I might have missed a tarball, I double-checked the instructions on the website...nope, had 'em all. Smoked the directory and redownloaded...nope, still didn't work.
Oh well, why should I be bothered to fight with it if they can't even be bothered to package it properly?
Wow, that was mind-blowingly hard.
...
Yep. Really hard to get working.
Why the sarcasm? No need to be hostile...If I mistakenly hit a soft spot by ragging on GCC, I apologize, but I really don't see how...
Sorry, that's my bad. I'll be the first to admit that I CAN be a bit sensitive at times. Good news, though: I gave gcc 4.5 another try and finally got it to build something. 23kB for a helloworld program in Windows, compared to 68kB for the same program compiled with VS2010, and a far cry from the 850kB I got from gcc 4.4.1.
Now, THAT's progress...they still need a better packaging system...I had to untar about 16 tarballs...but at least it works now.