Of all the different "PCs" I've used over the years, Macs would definitely rank in the least-used category -- even less than Linux. I remember back in elementary school using them to learn about animation (HyperCard) and to take geography tests and stuff like that. Fast forward a lot of years, and I've really only been exposed to my MacBook Air (which wasn't cheap).
At some point in time, Apple decided to drop the "PowerPC" architecture and switch to Intel, meaning Intel and Nvidia would become basically the main providers of hardware for the Mac operating system. That's right, Macs use the same manufacturers that all those Windows users use (again, making the whole Mac vs. PC argument kind of dumb).
Ok, with that out of the way, I must admit that my exposure to Macs is pretty limited. I'm actually using one right now to type up this blog entry, and this is pretty much where I prefer using it: daily stuff like checking email, trolling message boards, that kind of thing. It also helps that it's a laptop (or netbook, or whatever the buzzword for these small things is). I won't bother arguing about portability, since all laptops inherently have the same qualities irrespective of the operating system. One thing I will point out, however, is that the trackpad on this MacBook Air is awesome for how large it is, and how slick the whole thing is put together in general.
Again, growing up with Windows computers, I got used to the idea of right-clicking, interacting with my computer through the Control Panel, running DOS games (back in the Windows 3.1 era), making desktop shortcuts... well, now all of this is basically gone. Well, not gone, but different. I think the relearning curve is usually enough to make a lot of computer people immediately throw a Mac into the "I don't want to learn the difference" category. That's ok, I was like that too.
So, let's get to the meat and bones of the Mac article. I will say that my Linux background helped a lot (OS X's kernel has BSD roots, I believe)... but that's not to say it was required. One of my first tasks on a Mac was learning how to program for one. My primary reason for having a Mac was to port Windows software (games) to it. I found it practically insulting that I was *required* to sign a developer agreement with Apple before I could even get development tools. I couldn't just get GNU tools, or Linux tools, or anything without signing this dumb agreement first. That was a huge strike against Apple for me in general. After I downloaded Xcode (which I don't really use) I could finally get some development toolchains, including the familiar GNU tools (gcc, make, etc).
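Once the developer tools were installed, a quick sanity check from Terminal confirmed the familiar GNU toolchain was actually there (exact versions -- and whether "gcc" is really gcc or a shim -- will vary by Xcode release, so take this as a rough check rather than gospel):

```shell
# Confirm the GNU-style build tools are on the PATH and report
# their versions. Output will differ between Xcode releases.
command -v gcc && gcc --version | head -n 1
command -v make && make --version | head -n 1
```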
I played with Xcode a bit and it seems alright, but wow, there's enough Mac-specific stuff in there that it was a bit overwhelming. What was a universal binary? I can pack multiple binary types into a program? How do I make an app bundle? How do I link static libraries into my program? Why can't I just use g++ like normal? There were a lot of questions to be asked, and it took me quite a while to figure them out (hint: I'm using my Linux dev environment and basically skipped out on Xcode). Now, before I make some Mac user angry about this, I will admit Xcode seems like ok software, I just haven't had time to give it a fair chance. Remember, my primary reason for developing on Mac was to port Windows and Linux software; since I already had makefiles and lots of GNU stuff set up, it was simply easier to go with what was familiar and already provided.
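For what it's worth, the universal-binary and app-bundle questions turned out to have fairly mechanical answers from the command line. Here's a rough sketch of the kind of Makefile targets involved -- the `-arch` flags and bundle layout are from memory of that era's tools, and the names (`mygame`, `MyGame.app`) are mine, so treat this as illustrative rather than a drop-in recipe:

```makefile
# Build a universal (fat) binary covering Intel and PowerPC Macs of
# that era, then wrap it in a minimal .app bundle directory layout.
ARCHES   = -arch i386 -arch ppc
CXX      = g++
CXXFLAGS = -O2 $(ARCHES)

mygame: main.o
	$(CXX) $(CXXFLAGS) -o $@ main.o
	lipo -info $@   # should report both i386 and ppc slices

bundle: mygame
	mkdir -p MyGame.app/Contents/MacOS
	cp mygame MyGame.app/Contents/MacOS/MyGame
	# A real bundle also needs Info.plist (and usually Resources/)
	# under Contents/ before Finder will treat it as an app.
```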
Anyways, back to the Mac itself. Once I learned the basic tricks, like right-clicking being done by holding down Ctrl and clicking the mouse button, it was pretty easy to do stuff. On the "grandma friendly" scale, I'll give it an 8/10. You basically have few "complicated" options for doing anything, and I found the walkthroughs for various software (including setting up the machine itself) incredibly straightforward and easy to understand. It was plain English, and that's a worthwhile thing to give to the... uhh... less technical.
At one point I had to enable the root user (which is disabled by default) to do something (I think it was to enable multiple user accounts on the machine, if I remember right?). A walkthrough of how to do this can be found on Apple's site:
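If memory serves, there's also a one-line way to do it from Terminal with the `dsenableroot` utility. I went the GUI walkthrough route myself, so take this command-line alternative as an untested sketch:

```shell
# Enable the root account -- prompts for your admin password and a
# new root password. Run with -d later to disable root again.
sudo dsenableroot
```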
Once that was done, I was back in "almost Linux" land, using more familiar things, albeit simplified.
One of the next "nice" features was the automatic software update that OS X likes to run constantly. Ironically, I always disable Windows Update, yet I let the Mac do basically the same thing and I'm happy about it. Over the years I ran into too many problems with Microsoft shipping bad updates, so I turned it off long ago and never looked back.
Now for the huge issue with Macs. The hardware options are limited and incredibly (even inexplicably?) expensive. This is both a huge folly and a major advantage at the same time. Yes, you heard me, Windows users: the lack of hardware options is actually a good thing, but probably for a reason you're not expecting.
When I'm programming software for a Mac, it is basically guaranteed to work on all Macs that match the operating system's version or higher. If I'm writing an OpenGL application, I don't have to worry about the game "not working" on most people's computers, because we're all running more or less "standard" hardware. On top of that, the system actually runs more efficiently because it's not "bogged down" with drivers, "DLL hell", and version incompatibilities. I've spent significantly less time worrying about software compatibility issues on Macs than on Windows computers (or Linux, for that matter). This is much the same argument for why developing for game consoles is easier than for computers: consoles run standard hardware, so I don't have to worry about the mass of complex problems that comes with an almost unfathomable number of hardware variations among users.
Without really getting into it, know that developing for Macs requires less time in general spent testing and coding around very complicated hardware compatibility issues. If my software works on one Mac, it will probably work on a good majority of them, and if it doesn't, it will take very little time and effort to get reasonable compatibility. On Windows/Linux systems, this is nowhere near the case.
Again, standardized hardware vastly simplifies a software developer's job and basically ensures a wider adoption of their software for that platform. I'll explain more of these problems on windows machines in the next article.
All in all, OS X is a pretty good system. The hardware required to run it is both limited and expensive. I can't pop open the cover and swap parts in and out without voiding the warranty. On the plus side, Apple seems pretty forgiving if your system dies within the warranty period (and sometimes even outside it).
The system is definitely simplified for people who don't need to know a whole lot about computers to use one. If you can afford it, I would highly recommend one for people who like one-button mice and the almost total inability to damage their own system inadvertently.
Finally, I'm going to say that "Macs get fewer viruses" is an insane argument for why "Macs are better". It's the same reason there are more games for Windows computers; it all comes down to market share. All computers run programs -- well, at least ones with operating systems, I would hope. Because of this, all computers can run viruses. The author of any software looks at their market. If that market is substantially more Windows than Mac, they will focus their efforts on writing viruses for Windows. This is pretty basic business sense.
So yes, there are fewer viruses for Macs than for Windows, but that gives a false sense of security.
In summary: fewer options, less complexity, more expensive, and very easy to learn if you give it a chance. Recommended for casual users and for anything other than games (again, due to market share).