The dominant force on "PCs" over the past 25-ish years, all the way through today, is Microsoft Windows. Practically everyone has been exposed to it in some form throughout their lives. Why this exposure rate is so high is somewhat controversial, but if I remember right, Microsoft basically gave away Windows 2.x to anyone and everyone and turned a blind eye to "pirating". They also bought out their early competitors, invested in Apple stock, borrowed ideas, and did some other things that would essentially guarantee that most people would use Windows, especially in the business environment.
Today, computers running Windows are known for being very customizable at the hardware level. I will admit, one of the good things about Windows is that I can build a computer from almost any combination of components (as long as they are compatible with each other at the hardware level, of course), slap Windows on the machine, and it just works for the most part. Sure, sometimes I have to download an updated driver or two from the internet, but that's usually not a big problem. There were snags in earlier versions of Windows (95, 98, ME, Vista) that sometimes caused a lot of headaches, but in XP and 7, those seem to be mostly resolved.
Because of its high market share, many game developers target this operating system for "PC gaming". This is a blessing and a curse at the same time. Windows ships with an enormous library of drivers and compatibility software to make sure, for the most part, that Windows and the software running on it can use your hardware. Unfortunately, there are millions, if not billions, of possible hardware combinations in the world, and nobody can test them all.
Windows attempts to "guarantee" that software built to run on a "PC" will just work, as is evident with their DirectX software (which is a whole other issue I might talk about later). Back in the DOS days, software developers could get more or less "direct access" to your hardware -- specifically your graphics card. This allowed amazing performance, and disastrous situations when the software didn't talk to every individual card exactly correctly. Windows attempted to fix this by forcing all software to "talk through it" before communicating with any hardware in the computer, so that it could provide a translation-like service. The process is a bit complicated, so I'll skip the details for now.
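The "talk through the OS" idea can be sketched in a few lines. This is just a toy illustration of the general pattern, not the actual Windows driver model: every vendor driver implements one common interface, and applications call that interface without knowing which card is installed. All the class and function names below are made up.

```python
# Toy sketch of a driver abstraction layer: the application calls one
# common interface, and the OS routes the call to whichever vendor
# driver is installed. Names here are illustrative, not real APIs.

class GraphicsDriver:
    """Common interface every vendor driver must implement."""
    def draw_pixel(self, x, y, color):
        raise NotImplementedError

class VendorADriver(GraphicsDriver):
    # Pretend this card wants coordinates packed into one register value.
    def draw_pixel(self, x, y, color):
        return f"A:{(y << 16) | x}:{color}"

class VendorBDriver(GraphicsDriver):
    # Pretend this card takes coordinates separately.
    def draw_pixel(self, x, y, color):
        return f"B:{x},{y}:{color}"

def application_draw(driver: GraphicsDriver):
    # The application never knows (or cares) which card is installed;
    # the same call works against either driver.
    return driver.draw_pixel(10, 20, "red")
```

The application code is identical whichever driver object it is handed; each vendor hides its card's quirks behind the shared interface, which is the translation-like service in miniature.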
I was talking about games, so yes, Windows is generally known as the platform of choice when it comes to playing games on computers. Developers have to spend a lot of their development time on compatibility issues since there are so many different hardware combinations out there. For example, Intel CPUs and AMD CPUs each have their own defined behavior for various operations, specifically for things like significant digits and rounding in floating-point math.
Ever wonder why you might fall through a floor in some game? Well, that bug might only affect Intel CPUs... or just a subset of Intel CPUs. Without a hardware collection covering that situation for the developers to test against, the bug will likely ship in the released product, forcing a maintenance patch sooner or later to fix it. With Macs, this problem is vastly reduced because there are substantially fewer "hardware profiles". So, while Windows gives the end user almost unlimited choice in hardware, every combination poses a unique challenge for software developers, and that gives rise to the general idea that "Windows software is buggy". Heck, even Windows itself has historically been known as "unstable", "buggy", and "unreliable"... now, I'm not trying to sound like a Windows "fan boy", but I can't completely blame Windows for this reputation... for what it is doing, it does a pretty good job wrangling a vast quantity of potentially incompatible hardware, and working... well, for the most part.
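The fall-through-the-floor bug above usually comes down to comparing floating-point numbers exactly. Here's a minimal sketch with toy numbers (not from any real game) showing why exact comparison fails and the usual epsilon-tolerance fix:

```python
# Why a character can "fall through the floor": collision tests that
# compare floats exactly can fail due to tiny rounding differences,
# which can vary by CPU and compiler. Toy numbers for illustration.

floor_y = 0.3
player_y = 0.1 + 0.2    # mathematically 0.3, but not in binary floating point

print(player_y == floor_y)   # False on standard IEEE 754 doubles

# The usual fix: compare within a small tolerance (epsilon) instead of exactly.
EPSILON = 1e-9

def on_floor(y, floor):
    return abs(y - floor) <= EPSILON

print(on_floor(player_y, floor_y))   # True
```

If the exact comparison is what gates the collision response, a result that rounds differently on one CPU family is enough to let the player slip through on that hardware and nowhere else, which is exactly why these bugs are so hard to reproduce.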
Not only do games target the Windows market share, but basically every business entity in the world tries to make software for it -- including hardware vendors writing drivers. This is what kills the Linux market: hardware vendors either have lax support for their Linux drivers or none at all, leaving driver support up to some dedicated community member building the driver in their spare time at home, who we hope shares it with the rest of the world (thankfully, they usually do). So, it is a bit of a misconception to think that Windows is a "superior" product when it clearly has far more time, money, and people behind it in the form of tech support, drivers, and compatibility efforts.
Another annoying, yet possibly humorous, thing about Windows is the number of popup questions. "Are you sure you want to cancel this operation?", "Clicking this will install malware on your system, yes or no"... stuff like this... and sometimes you get a train of popup questions... or popup ads... or other annoying prompts. The newer versions of Windows use User Account Control, which asks a non-administrator user for a password essentially every time they want to do almost anything other than browse the internet. Naturally, the "solution" most people land on is to use ONLY the administrator account to "reduce the number of popups and questions". Brilliant.

With that said, let's talk about viruses and other malware. Again, the claim by other product lines that they get fewer viruses is not really a fair selling point. Just as software vendors target the dominant market player, so too do virus writers target Windows. Usually, Windows will realize that some software needs administrative rights to do something malicious on your system... like a good operating system, it will deny access (like writing something to your system folders) and ask your permission before the malware does what it wants to do. Problem is, since lots of Windows users run as administrators, this permission is granted automatically. Congratulations on your new virus and your membership in a botnet.
On the "grandma frieldy" scale, I'm going to give the newer versions of windows a 5/10. It's pretty easy to set up, but using it is actually not very straightforward. Almost every application has its own idea with how best to "guide users" through their own configuration and setting up... this process is by far not the same for almost every software, including most software itself from microsoft. A standard process is essential, especially in stuff like business, so people get used to the idea of how to work things. Every time a new version of a microsoft product comes out, they change significant enough stuff to alienate almost all users except the most fluent people already with computers (they can adapt easier to change I guess). I've heard countless times in small businesses that when a new version of MS Office comes out that they have to send everyone to training classes again because they have to learn the software all over again (and pay a hefty price for the software upgrade anyways).
Oh, that's another thing. Windows users are essentially paying for upgrades in an almost subscription-like fashion. Are the various versions of Windows unique products unto themselves, or is each just the latest version of the same thing? Well, their build numbers would seem to say that Windows XP was the next version after Windows ME, which was the upgrade to 98... and so too is Vista to XP, Win 7 to Vista, etc. With "free" operating systems, and other software in general, you don't pay for upgrades, maybe just for support. You go to their website, download a patch, and you're good. Microsoft's software is definitely not like this: they've been charging their userbase ever since the Windows 3.x days (before Win 95), and if you don't pay for upgrades, they phase you out until you pay up again for the latest version. This is what came to be known as "forced upgrades".
Sure, you could still use Windows 3.1 today if you didn't want to upgrade; in fact, you could reasonably argue that you would get fewer viruses and be "smart" about computing. I'll leave that as something for the reader to think about.
Developing on DOS was awesome: it came with BASIC (which is where my own programming started when I was 7 years old). Nowadays, you have to either get Visual Studio or one of the many free alternatives like http://www.codeblocks.org/. Of course, you can always go with one of the .NET products (not really my forte). Since I make games on Windows (and Mac, and Linux), I will admit that writing the software is relatively easy thanks to copious amounts of information on the internet and MSDN (which I didn't have to sign my soul away to read -- no thanks to you, Apple and Xcode, ffs). Yet, when it comes to testing my software on other people's systems... that's when the problems strike (read above), forcing me to spend a LOT of time on compatibility issues, especially in graphics development.
In summary, Windows is a good OS for gaming and general productivity with "office-like" use. It is a general solution to computing needs and has the benefit of being widely supported throughout the world. Its flexibility with tons of hardware is both its greatest benefit and its greatest folly. It's likely the easiest to install of the operating systems I reviewed, but it has plenty of downfalls as you use it. I didn't even touch the "server line" of Windows, for one simple reason: I don't use it enough to have an educated opinion about it. I generally see other operating systems as vastly superior in this area to Microsoft's line (and much, MUCH cheaper).
So, hopefully by now, you, the reader, will have a better appreciation (if you read all this anyway, haha) for why the idea of "PC vs Mac" is inane. I compared three different "PCs", and they each had their ups and downs.