The problem is that so much software looks for the programs without the version numbers and doesn't know what to do when you have version numbers in the filename (which is an ugly hack in and of itself). Then you have to do even uglier hacks like symlinking the program name without the version to the version you want to use. |
A few things here: the beauty of the package manager is that it knows what an application's dependencies are, and will find and install whatever is needed to make it work. This usually means libraries. It is especially aware of versions in this process.
For actual executables, the application installer has a shell script which determines whether a particular executable or subsystem (a compiler, say) exists. If they don't exist, the installer can install them as well.
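That kind of existence check is usually just a few lines of shell. A minimal sketch of the idea (the tool names and messages here are illustrative, not taken from any particular installer):

```shell
#!/bin/sh
# Sketch of the check a configure-style script performs:
# look for a required tool on PATH and report where it was found.
check_tool() {
    if command -v "$1" >/dev/null 2>&1; then
        echo "found $1: $(command -v "$1")"
    else
        echo "$1 not found; install it via your package manager" >&2
        return 1
    fi
}

check_tool sh
```

Real configure scripts go further (version checks, test compiles), but the core is the same PATH lookup.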
Some things aren't named with a version number. For example, Boost is in /usr/include/boost, and that is the current version. Keeping a previous version of Boost would be massive duplication - the diff between versions is probably small compared to the size of the whole thing. One can always keep a backup of it, though.
You misunderstand, I am talking about libraries like SDL, SFML, FreeType |
I made that point to contrast what happens with personal programs as opposed to actual software such as the libraries you mentioned.
where do you install those to? |
I personally put them in /opt. Those directories hold the files the app needs; it finds the libraries in the standard place, as mentioned earlier.
What do you pass to -DCMAKE_INSTALL_PREFIX? |
Well, those options exist to allow customisation of where things are put, but often that is unnecessary, and the build and install are simply run from the source directory with the default locations.
Then what about when you build other things that depend on them? By default they assume you installed it globally mixed in with everything else. |
The install script will go looking for what it needs; otherwise there are switches to specify where things are. To avoid that, it's easier to go with the default locations. I wouldn't put library files in some non-standard place unless I specifically wanted to keep them separate.
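Concretely, the switches in question look something like the following sketch (the paths and project names are hypothetical): install a library under a custom prefix, then point a dependent build at it.

```shell
# Install a library somewhere non-default (hypothetical paths):
cmake -S mylib -B mylib/build -DCMAKE_INSTALL_PREFIX=/opt/mylib
cmake --build mylib/build
cmake --install mylib/build

# A dependent project won't search /opt/mylib by default,
# so tell it where to look:
cmake -S app -B app/build -DCMAKE_PREFIX_PATH=/opt/mylib
```

With the default prefix (/usr/local on most systems), none of these switches are needed, which is the point above.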
That's not what I called organized. |
Well, you perhaps need to look at it differently. You are looking at it from an "all the stuff I ever need is in my room" angle, whereas in a shared house the kitchen, bathroom (not always), TV room, office etc. are shared spaces. Your view sounds like each of the four household members should have their own self-contained one-room apartment within the house.
Why should someone else's idea of 'organized' be forced upon me? Why did the GoboLinux people have to completely modify the OS just to avoid that default? |
Each to their own, I guess. Remember, UNIX has been used by large organisations, and by many individuals through Linux, for 40-odd years, so all these people have figured out that it works. If some group wants to organise it their own way, that's up to them.
You assume that a single application is made up of only one executable. |
In /usr/bin there is typically one executable for each software package, and they are all quite small - about 500-800 KB. That executable knows where to find other files in the directory where the software is installed, and library files in their standard location. Often they are links to some other executable in the installed directory. When or if I need to make a shortcut on the desktop, I go looking in /usr/bin, not in multiple places. System files are organised like that too, so they can be found by scripts.
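Those links are easy to see directly. A quick sketch (which links exist, and where they point, depends entirely on the system):

```shell
#!/bin/sh
# List a few entries in /usr/bin that are symlinks, and their targets.
# Purely illustrative; output varies from system to system.
for f in /usr/bin/*; do
    [ -L "$f" ] && printf '%s -> %s\n' "$f" "$(readlink "$f")"
done | head -n 5
```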
Why not group the executables, libraries, and headers into the same place? Yes, that's not the way it currently is, but can you explain why the way it is now is better than any alternatives? |
Because they are all optionally shared. Linux does not suffer from the fiasco of unversioned dependencies, because it has package managers that know all about versions and can resolve dependencies.
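That version awareness is visible in the sonames of the shared libraries an executable links against. For example, on a typical glibc-based Linux system:

```shell
# ldd prints the shared libraries a dynamically linked executable
# resolves at load time; the sonames (e.g. libc.so.6) carry the
# major version that the package manager tracked as a dependency.
ldd /bin/ls
```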
DLL sharing isn't really recommended these days |
See my last point.
I'm not trying to convince you otherwise, I'm just saying that you shouldn't take for granted that what you are used to is the best possible way. Some people actually go out and defend the registry or the *nix directory layout, rather than accepting that it's just a historical decision we've never changed. |
Well, that goes both ways, doesn't it? You seem frustrated here with the UNIX system, but elsewhere complain about the difficulty of linking custom builds on Windows. So I am not seeing much acceptance / agnosticism on your part.
On my side (I am probably no more agnostic than you seem to be), when I use Windows as an ordinary user, I use the software and don't worry about anything. I did try developing a few small things on Windows, but I found the free versions of VS just annoying. Maybe it was the lack of standards compliance, and the sense of being corralled towards .NET or C#. I was trying to write code for AutoCAD, which seemed rather awkward to do with C++.
Another way to look at things: there are automatic transmissions and manual transmissions. If we metaphorically assign those to Windows and UNIX, then recall that there are also adaptive automatics, which let you select a gear but will otherwise do things like change down automatically as the car slows, rather than staying in a high gear like an old auto would. Then we could ask how close each OS comes to the adaptive auto.
I don't have a very strong aversion to *nix, it's just that there is very little reason for me to pick between *nix and Windows and the directory layout ended up being the deciding factor. |
I think there is a strong reason for you to try Linux: the relative ease of building, linking and installing, which you seem unhappy with at the moment. The balancing factor might be education - there is a steep learning curve for Linux; it is a different paradigm. I am sure a smart young man like yourself could take to it like a duck to water, given your already substantial experience.
So why not take the plunge - build something on Linux. See how you get on.
Also, try building something with Qt, then use that same code base to target another OS.
Yeah, not sure what that has to do with the discussion though? On that note though, since it's pretty clear that people don't update systems I really don't understand the fear of breaking reverse compatibility. It seems like the only way we fix the mistakes of the past is by creating entirely new systems with mistakes of their own instead of just fixing what we already have. |
I mentioned that because you might be faced with applying for a job that does involve continually patching up a nightmare app. And despite having a degree, experience is gained on the job.
Sometimes consultancies prefer to keep 10 people in essentially lifetime jobs on an almost-broken system, rather than propose spending 12 months building a brand-new one. Fixing the old system to make it equivalent to a new one is impossible because of its poor design. Management also seem reluctant to start afresh sometimes. When they do agree to start again, the new system has problems of its own, as you said.
I read the link you posted; it seemed reasonable enough. I noted that the replies all seemed to bag the Windows folder names, but that is probably because the vast majority of users are Windows users.
Anyway, a good discussion IMO :-)
Regards