Switching from Linux to Microsoft

Since 2008 I've been using a Linux operating system at home. I recently had to retire my old laptop, so I decided to try a Microsoft system again. I've always had to use Microsoft at work, and about three years ago I found that if I use CMake, working with the Microsoft compiler is painless. For the simple stuff I do, using Visual Studio is painful because I don't really want to learn to navigate another IDE.
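For reference, the kind of CMake setup I mean really is minimal; something like the following (project and file names are invented for illustration) is enough to drive the Microsoft compiler without ever opening the IDE:

```cmake
# hypothetical minimal project; the target and source names are placeholders
cmake_minimum_required(VERSION 3.16)
project(hello LANGUAGES CXX)
add_executable(hello main.cpp)
```

From a developer command prompt, `cmake -S . -B build` followed by `cmake --build build --config Release` picks up whatever Visual Studio toolchain is installed, no IDE navigation required.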

I'd heard that you could run Ubuntu inside Microsoft, so I gave it a try. As far as I'm concerned, it works like a dream. The real reason I've always used Debian-based operating systems is that I love the ease of their package manager.

Since I'm not a LAN administrator I've just avoided PowerShell at work, but I was home sick last week and decided to sit down and learn it. The reason I put it off so long is that Microsoft, in my opinion, has a history of giving up on products. VB6 was quite pleasant to work with; you really could be a complete novice and get things done, and then one day Microsoft said, sorry, we're done with that. I'm never going back to cmd again; I think I can say I love PowerShell.

So thanks to Ubuntu, CMake, and PowerShell I'm becoming a full-fledged Microsoft fanboy.
Cygwin ports most of the Unix command line to Windows. It has g++ and many other languages like Python, as well as its own shell program (I prefer to integrate it into PowerShell or cmd via the path), and it can compile many Unix-only source code projects on Windows.

This isn't really learning to use Windows, though; it's more a matter of pulling over what you're used to. It's a little better than running a VM -- it's faster, and the executables are built for Windows, so not translated, etc. It has limits, but it's powerful.

... VB is still in Visual Studio, you know? They took individual products and merged them all into the one IDE. The only oddball I know of is Visual Fortran, which was not a Microsoft product anyway. It went... somewhere; I lost track of what happened to it.
VB is still in Visual Studio, you know?
Not VB6! Visual Basic .NET is quite different from VB6.

The only oddball I know of is Visual Fortran
I believe you're forgetting Visual FoxPro.
Not VB6! Visual Basic .NET is quite different from VB6.

Well, yeah, and Managed C++ is a lot different from MFC too. It's "new and not really improved" in both cases; that is just how M$ rolls, but it's "there". Ah, the good old days...

I believe you're forgetting Visual FoxPro.
I'm not sure I knew what that was, then or now!
Managed C++ is also not comparable to standard C++.

MFC is an MS attempt to C++-ify the C-based Windows API. It hides a lot of the nitty-gritty housekeeping details a programmer otherwise had to muck around with.
Its "new and not really improved" in both cases
I think you're looking through rose-tinted glasses. VB6 was a pain to use, which is unsurprising given that it was basically QuickBASIC with various features bolted on through the years. VB.NET is a proper modern language, although it's kind of been left in the dust by now. It was meant to on-board VB6 developers to the .NET platform so they could eventually migrate to C#.
Maybe. I just know I could jump into VB6 without much more knowledge than I had from Apple IIe days and get things working. The .NET upgrades made it more of a struggle than it was worth; if I wanted to deal with OOP and complexity, I would not have gone for the BASIC! But I agree, everything seems to be a funnel towards C#. I don't hate C#, but I hate the push to try to make us use it.
I believe you're forgetting Visual FoxPro.

Visual FoxPro was the Windows version of FoxPro, which was originally a dBase-style programming language (like Clipper, etc.) before evolving into an object-oriented, COM-based development environment. The last version was issued in 2007, with support discontinued in 2015.

FoxPro was developed by Fox Software in the mid-1980s. Fox Software became part of Microsoft in 1992. The language was similar to BASIC, was easy to use and learn, and could 'easily' create quite complex database applications.
jonnin wrote:
I just know I could jump into VB6 without much more knowledge than I had from Apple IIe days and get things working. The .NET upgrades made it more of a struggle than it was worth; if I wanted to deal with OOP & complexity I would not have gone for the BASIC!


I could not agree more; VB6 suffered because it was so good at what it did that developers chose to use it when they shouldn't have. Then people blamed VB6, when they should have blamed themselves for choosing a language that doesn't scale well, making large projects hard to maintain. VB6 was a programming language designed for simple problems, and to be easily understood by non-professionals.

I write a lot of Perl scripts because it really does make easy problems easy. People get angry because it's not a good choice for a complex problem, when they should really be angry that someone chose Perl for a complex problem in the first place. It's just a scripting language: powerful enough to get something working on a complex problem, but a really bad choice for one.
I've never found a reason to use a Linux OS. Anything you can do in Linux, especially these days, can be done just as easily in Windows.

The only semi-valid reason I've heard is that since Linux has a lot fewer viruses written for it, you can run Linux and then have a VM of Windows for everything you need it for -- like gaming. While it makes sense, it's really grasping at straws just to not run Windows natively.
you can run Linux then have a VM of Windows


Windows Subsystem for Linux is a compatibility layer for running Linux binary executables natively on Windows 10, Windows 11, and Windows Server 2019.
I've never found a reason to use a Linux OS


well, you will :)
Lots of places are going to have servers running Linux. You may not run it at home or even on your work computer (there isn't any point, as you've already seen), but you WILL very likely at some point need to PuTTY into a server to set your password, copy a file, or do some stupid little task. I had to do it all the time in my previous job, constantly having to look at (grep at) a log file or data dump or some other such thing. A basic ability to use the Unix command line is a big ++ if your company has Linux servers, which is likely if the area is "IT" focused (IT meaning cloud, networked, multi-platform, service, etc., as compared to making a program for end users to install).
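A sketch of that grep-at-a-log-file chore, with an invented file name and contents, just to show how small these server tasks usually are:

```shell
# make a throwaway log file, then pull out only the error lines --
# the kind of one-liner you end up running over SSH all day
printf 'INFO start\nERROR disk full\nINFO done\n' > app.log
grep 'ERROR' app.log    # prints: ERROR disk full
```

The same pattern scales to real work: swap `app.log` for whatever the server actually writes, and pipe through `less` or `wc -l` as needed.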

Running it at home if you are not a pro sysadmin is an exercise in frustration. You will learn a lot, but it's hard to be productive when a third or so of everything you download fails to build or work out of the box.
For me, Linux on the desktop is just an untenable proposition. There's lots of little things that just don't work properly. But as a server OS it works great. I have a couple of boxes at home running various services: a file server, a firewall and VPN, some VMs, etc. One is even plugged into the auxiliary of some speakers so I can play music in the mornings and control it from my phone (to some extent). Running Linux servers at home is great if you don't want to be totally lost when you eventually need to SSH into a VPS.
If you don't want to go full-out and build a physical server, running small VMs is a lightweight option. Hyper-V is great these days and you get pretty much native performance. Even if you don't have anything to put on a server, you should at least be familiar with building and running things on UNIX.

you can run Linux then have a VM of Windows for everything you need it for - like gaming.
Hah! Good luck. Gaming inside a VM is just not the same, especially for newer games. You need to be able to do PCIe passthrough, and both the hardware and software can be extremely finicky. YouTube "7 gamers 1 CPU". Some of the compatibility issues mentioned are unique to what they were trying to do, but others you will get any time you try to do passthrough. Support for it is really spotty on consumer hardware.
Linux ... There's lots of little things that just don't work properly
I've been using KDE as my primary desktop for a while. I find it satisfactory.

Whenever I am confronted with Windows, I just think of a tedious monolith that I have to be paid to use from time to time. The delays, the quirks, the inconsistencies, ... the mess of it.
I found this quote once, but I could never find it again after a while. I think it's a good attitude to have.

All software sucks. All hardware sucks. Use whatever hurts the least.
helios wrote:
All software sucks. All hardware sucks. Use whatever hurts the least.


Not sure I agree, but I would say all software has issues, or at least all operating systems have issues. The laptops I set up for the grandkids are running Linux Mint. I tried setting them up with Windows, but it was a hassle; they brought them back in a horrible state and I got tired of trying to figure out the problem. I'm nearly sure it was an update that was interrupted, both times.

For me, having them use Linux is a dream; all they use the laptops for is Google Docs for school, watching YouTube videos, and occasionally playing Minecraft. If I walk by and see their laptops when they're at my place, I take a moment to run sudo apt update and sudo apt upgrade, and everything is good.

While I'm saying how much I like Microsoft, it is often a pain in the ass for simple stuff.
While I'm saying how much I like Microsoft, it is often a pain in the ass for simple stuff.

It's all down to personal experience, familiarity, and so on. I'll be the first to grant you that a Windows machine that has been tampered with by a low-skill user is hopeless. But low-skill users do not tend to install Linux, so there isn't a fair way to compare that. I can give you one example: we had an intern run a Linux executable on our Silicon Graphics box and it wiped the OS out, which then required us to do some godawful thing called a butterfly format of the disk and then reinstall, which took a team of 3 people on the phone to walk us through. Thankfully, since the hardware cost 25 times what anything else did, they had some service and support; Linux does not have even that on the free versions.

Updating current Windows is as simple as leaving it alone; it does it for you, hands-off, and the most you know about it is that it rebooted and you have to log in again.

On the flip side, I put a second hard drive in a Linux box and it required recompiling the operating system to get it to work again, which took like 2 days of trying to figure out a bunch of confusingly named settings related to Miss Manners and Mr. Proper or something like that. Windows: you plug in the drive, it says it found it, downloads the driver for you, reboots, prompts you to format if needed, and it's ready to go. I have never had to recompile Windows.

Which is all just to say that you can have an awful time with either one of them in the hands of a novice, or after a novice has screwed with them. Neither one is remotely idiot-proof, and once it's busted, fixing it can take days. Both have places where doing something basic is stupidly difficult (consider moving junk off the C drive to preserve your SSD, for example; it's a nightmare even for experienced users).
I switched back from Ubuntu to Win10 at the end of 2019. Too many bugs and errors, too much time wasted on troubleshooting, and a lot of little software I needed was simply not available. Much happier now, even though Win10 has its issues.


but you WILL very likely at some point need to PuTTY into a server to set your password or copy a file or some stupid little task

Well, for university I did stuff like that; I just meant personally. Though now I recall that I have a bunch of Raspberry Pis which all use Linux, so I have been using Linux for personal use.

Honestly, I hate using it. Too many things to take care of: change that setting, get this library, use that thing... To code for a Bluetooth module, it took a while just to set up the environment. But when using a microcontroller, programming the Bluetooth module took almost no time at all.

These kinds of niches usually can't get away from using Linux at some level, but I always dread using it. If for no other reason, I hate the reliance on the terminal, which can often give vague errors and wacky behavior.

I remember my TA in a class gave us a list of terminal commands to get our assignments off of GitHub. I had barely used GitHub at that point, and his terminal commands had an error: they were missing an important line. I, of course, didn't notice, and that resulted in about a day of confusion and the TA being a complete condescending ass about his own mistake, which I fixed.

GOD, I could rant about that TA as much as some of these awful professors.


Hah! Good luck. Gaming inside a VM is just not the same, especially for newer games.

That's what I saw a YouTuber do, literally gaming on Linux with a VM of Windows. I'm fairly certain most if not all games would suffer drastically because of it, but he REALLY liked being safe from viruses, I guess.
Most games target low-to-middle hardware: they have to work in a VM, on 'gaming laptops' which have reduced capacity compared to top desktops, and on older systems with old graphics cards, low-end RAM, etc.
I think the current target is to run (in reduced graphics modes, but to function reasonably) on 16 GB of RAM (vs 64 on most new machines), i5-level processors with 4 cores (vs 10+ cores on the i9s), and 1070-or-lower graphics cards (currently in the 3000 generation, though the numbers at times seem totally random).

Which tells you that a top-end machine can run a game reasonably well in a VM. Look at all that money they saved not buying Windows while spending $4k USD on a Linux box :P