Static vs Dynamic Linking

Hey everyone!

I wanted to get your thoughts regarding static vs dynamic linking. Basically, when should I choose one over the other?

Thank you.
closed account (E0p9LyTq)
Dynamic linking is great if you want your executable to be as small as possible, at the cost of possibly having to include OS and third-party library files when you distribute your program.

Static linking packs everything into your executable so all you theoretically need to give someone is the file itself.

The more libraries you use, the larger your executable can get.

I usually use dynamic linking when creating apps and programs just for myself.

This applies to Windows, since that is my OS. I don't know how macOS or Linux do it.
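
To make that concrete, here is a minimal sketch, assuming a GCC-style toolchain (g++, or MinGW-w64 on Windows); the exact flags vary by toolchain:

// demo.cpp: the same source can be linked either way.
// Dynamic (the default):  g++ demo.cpp -o demo
//   small executable, but it needs the compiler's runtime DLL/SO at run time.
// Static:                 g++ -static demo.cpp -o demo
//   larger executable, the runtime is baked in, nothing extra to ship.
#include <iostream>

int main() {
    std::cout << "Hello, linking!\n";
}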
Thank you, FurryGuy! That's pretty much what I managed to gather through independent research as well, but I needed someone to confirm it for me. Dynamic linking seems like a good way to go because all the linking happens at run time and your .exe stays small.

Of course, I will leave this topic open so if anyone wants to add more to this feel free to do so!
The choice often comes down to what you expect your users to have to install.

If you are using libraries that are reasonable to keep as shared DLLs on your user’s computer, then dynamically link.


For example, right now I’m writing code that plays with image file formats. Things like PNG and JPEG are common libraries, and I can expect my users to have them already installed. If not, I can provide them in a separate download.

Something like Exif handling is not likely to be a common library. I can either statically link it or just supply a DLL that sits next to my executable in its install location.


Typically, people will statically link applications to make them easily distributable and, especially, usable without having to install them in some way. A single exe works wherever you put it: in your downloads directory or on a USB stick, it doesn't matter.

Applications with even one non-system DLL dependency become less friendly to that kind of install-free use.


Hope this helps, in addition to FurryGuy's advice.


P.S. Linux and OS X do it the same way. The only real difference is how your user gets and installs DLLs (called SOs, for shared objects, on those systems). If your application can be properly wrapped in a package, then the package manager will extract and put everything where it belongs, including the SOs and your executable.

Mac systems do have some wrinkles, such as dylibs, that I don't really know how to install.

Just to add some more to the mix.

A DLL will only be loaded once by the OS. So if something else is already using the DLL, your program gets a free speed-up by not having to load it before using it.

By being shared, this also reduces the total memory footprint of all the programs in memory.

A downside of DLLs is DLL injection: https://en.wikipedia.org/wiki/DLL_injection
You might not be entirely sure that something isn't modifying or intercepting what your program is doing.
Programs that need a high level of security might want to statically link.

A pro and a con at the same time:
Pro - you get free security updates whenever a common DLL is updated.
Con - you have to rely on third parties to make sure the system is patched up to date.

For my personal use, I prefer dynamic linking. I know that when I make a change to a DLL, all programs that use that DLL automatically get that change.

However, one perspective that has not been mentioned is business-critical production systems.
My company's standard (on the platform I work on) for production systems is static linking (excluding OS DLLs). The rationale is that a business-critical production program cannot be affected by another program making a breaking change to a shared DLL. The version of the program that was QA'ed remains untouched in production. The one exception to this is where the need for a particular DLL can only be determined at run time. This is very rare in our production environment.

Edit: Added "in our production environment" to statement about rarity of dynamically (explicitly) loaded DLLs.
A few notes.

Shared libraries only make your program smaller if the library is already on the computer. Otherwise the overall size of the "executable file" is still the same; it's just distributed among different files.

The big advantage of shared libraries (and, I think, the original motivation behind them) is that you can have one copy of the library loaded in memory and share it among several different programs. If you're running several copies of the same program, then there is no benefit, because those processes all share the same code image anyway. Think of the C library: every C and C++ program running on the computer can share the same copy of the library code. That's a big savings in RAM.

We have the same issue as AbstractionAnon at work: business critical software where we don't want a problem in a shared library to affect a dozen programs that use it. So we do static linking to our own libraries with most of our programs.

There are probably differences in compile time (link time, really) and program load time. If either of these matters to you, then measure it both ways to see if it's important.
The one exception to this is where the need for a particular DLL can only be determined at run time. This is very rare.

Plugins or optional features.

Normal dynamic linking is implicit: loading the executable automatically loads the linked libraries, or fails if they are missing. All you need to do is include the library's headers and tell the linker where the library is.
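
As a minimal sketch of implicit linking, assuming zlib and its development header are installed and a GCC-style link line:

// Implicit dynamic linking: include the header, name the library on the
// link line (e.g. g++ main.cpp -lz), and the loader maps the shared
// library in before main() runs, or refuses to start if it is missing.
#include <iostream>
#include <zlib.h>   // assumes the zlib development package is installed

int main() {
    std::cout << "Linked against zlib " << zlibVersion() << '\n';
}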

For explicit dynamic linking you have to write code that attempts to load a library and map symbols (functions). If successful, the program can use the library.

The program could, for example, check whether the system has a PNG library and, if so, offer additional features.
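
A minimal sketch of that idea on Linux, using dlopen/dlsym; the soname "libpng16.so.16" and the symbol checked are assumptions about a typical libpng install (on Windows the equivalents are LoadLibrary and GetProcAddress):

// Explicit dynamic linking (POSIX): try to load libpng at run time and
// enable the optional PNG feature only if the library is actually there.
// Build with:  g++ png_check.cpp -ldl
#include <dlfcn.h>
#include <iostream>

int main() {
    // The exact soname varies by system; this one is an assumption.
    void* lib = dlopen("libpng16.so.16", RTLD_LAZY);
    if (!lib) {
        std::cout << "No libpng found; running without PNG support.\n";
        return 0;
    }
    // Map a symbol the optional feature would need.
    void* sym = dlsym(lib, "png_create_read_struct");
    std::cout << (sym ? "PNG support enabled.\n"
                      : "libpng present but missing expected symbol.\n");
    dlclose(lib);
}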


Even a monolithic statically linked binary has to interact with the OS somehow. It is possible that changes in the OS over time break an old, otherwise self-sufficient binary. I've seen that happen.
(Were the once-common installer tools statically linked? They were 16-bit binaries, and 64-bit Windows did drop support for them.)
Heh, I am very impressed by the amount of expertise brought out in this thread. You asked an excellent question, Angela1998!

This thread is the kind of stuff I wish could get stickied somewhere.
@keskiverto:
When I referred to dynamically (explicitly) loaded DLLs as being rare I was referring to our production environment. I edited my post to better reflect that.
For Angela1998’s benefit, there are a lot of confusing terms to deal with when handling libraries. She probably has a pretty good handle on the important ones, but to be complete:

A statically linked library is one where the library code is compiled as part of your final executable.

A dynamically linked library is one where the library code is placed in a DLL (.dll or .so [or other system-specific whatever]).

The DLL itself can be statically bound (or early bound) or dynamically bound (or late bound). Feel free to substitute the word “loaded” for “bound” at any time.


When the code is statically linked to your executable, it is immediately available for use because it is part of the executable.

When the code is dynamically-linked, early bound, it means that your executable contains information that the operating system uses to load and link your executable to the code in the DLL. This happens before your program begins execution. If the OS cannot perform the task, either because the DLL is missing, or because it doesn’t match the executable, then the OS aborts the process and tells you that the DLL is missing or corrupt.

When the code is dynamically-linked, late bound, it means that the OS doesn’t know anything about your requirements for the DLL. That is, your program doesn’t actually need the DLL to begin executing. At some point, your program will explicitly ask, in code, for the OS to load the DLL, and then your program will explicitly ask for access to symbols in the DLL (such as functions that it can call).

This last type of library is often used for things like plug-ins and managing application updates.
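
A minimal late-binding sketch on Windows along those lines; "plugin.dll" and "PluginEntry" are hypothetical names used only for illustration:

// Late (explicit) binding: the program starts fine without the DLL and
// only asks the OS for it when the optional feature is wanted.
#include <windows.h>
#include <iostream>

int main() {
    HMODULE dll = LoadLibraryA("plugin.dll");   // hypothetical plug-in DLL
    if (!dll) {
        std::cout << "Plug-in not installed; continuing without it.\n";
        return 0;
    }
    // Ask for a symbol by name; "PluginEntry" is a made-up example.
    using PluginFn = int (*)();
    auto entry = reinterpret_cast<PluginFn>(GetProcAddress(dll, "PluginEntry"));
    if (entry)
        std::cout << "Plug-in says: " << entry() << '\n';
    FreeLibrary(dll);
}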

Whew. That’s enough for now.
The terms are slightly different on Unix, though, but the ideas are the same.

Generally, small programs, utilities, etc. are static, and most large projects are dynamic. Dynamic has a great many advantages over static. Static has only one real advantage: it's all in one file, so it's easy to move around. In some edge cases static may be faster, but as the program file grows, dynamic becomes faster, and it's not easy to second-guess where that crossover will happen. Certainly look at it once the program's footprint exceeds a page of memory, if you care.
Theoretically, can't static linking allow for more link-time optimization (LTO)?
Because a compiler certainly can't optimize a dynamically loaded function call, right?

(I really don't know how that works.)
Yes.

In fact, that is where LTO is performed, at the static-link stage.

(I had to look this up, because I don’t care.)
(Though, I’m pretty sure I’ve read about it before.)
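A minimal sketch of what that looks like with GCC or Clang (the -flto flag; details vary by toolchain):

// lib.cpp
int add_one(int x) { return x + 1; }

// main.cpp
int add_one(int x);
int main() { return add_one(41); }

// Without LTO:  g++ -O2 -c lib.cpp main.cpp && g++ lib.o main.o
//   add_one remains an out-of-line call across translation units.
// With LTO:     g++ -O2 -flto -c lib.cpp main.cpp && g++ -O2 -flto lib.o main.o
//   the linker hands the intermediate representation back to the compiler,
//   which can inline add_one into main at static-link time.
// A call into a DLL/SO can't be optimized this way; that code only shows
// up at run time.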
I personally would like it a lot if many Linux programs were distributed as static. I do not care if that occupies more memory, because the effort it can take to find all the required packages is far beyond what normal words can express. Plus, in many cases the extra memory consumption or initialization time borders on irrelevance.
I personally would like it a lot if many Linux programs were distributed as static. I do not care if that occupies more memory, because the effort it can take to find all the required packages

What??

That is more about how software is distributed than about linking.

The traditional way to get/install an application is:
download sources
./configure ; make ; make install

That is, you get the source code and compile it yourself. The 'configure' step will inform you about missing dependencies. Yes, the "What provides XYZ?" question could be an issue.

That is why common Linux distributions provide applications as (pre-compiled) packages with metadata on dependencies and store the packages in consistent repositories. A package manager utility can query the dependencies and fetch and install all the necessary packages for you.

If there is a packaged version available for your distro, then use it.

Third-party developers can be lazy about packaging, since there are many package formats. Containers have become an OS-agnostic alternative; they can contain every necessary library.
Yes, I am well aware of how to compile Linux source code. The main problem is that not all packages are available for your distro, and fewer and fewer will be as time goes on and your distro's support is dropped. And to say "upgrade your distro" is not a serious solution, mainly because the opposite happens: old packages will not be supported in a newer distro. And to say "just use a newer version" is not a serious solution either, because that's basically forcing someone to learn a newer version, possibly due to the whim of a developer.

Not to mention when one cannot upgrade due to hardware issues. Or there may not be a newer version. Or the newer versions may not be capable of doing what the older version did. And yes, I realize this is not a Linux-exclusive thing and that its severity depends on the distro you use, but generally Linux as a whole does not take backwards compatibility seriously.

A great amount of open source software does not care about that issue, not including the required libraries or the required source code. And that can be a hassle of unfathomable proportions, because when someone decides to undergo all the work of fetching, compiling and installing all the required stuff, if the fetched packages are not the ones the developers used, there is a good chance the compile will fail because it cannot find an expected function or something else. Or you already have an installed version, forcing you to mess with system environment variables to fix it.

And statically linking all the dependencies can be a solution, albeit a poor one that consumes too much memory and just does not work in all cases, since changes to low-level libraries or interfaces can make statically linked binaries built on newer distros useless on older versions. The best would be to bundle everything necessary, even

Closed source software cares much more about this issue, IMO, and tries to package all the required libraries with it, and many times this avoids the issues, but issues with glibc and libstdc++ still happen.

That said, static linking could do the job in many cases, even if it is the poorer solution in many of them.
Topic archived. No new replies allowed.