I built almost every computer I owned before getting my MacBook Pro. I hand-picked every part for maximum performance. I did own a Dell laptop, an HP, an Alienware (before Dell bought and ruined them), and a Sager.
I still use the Sager. It runs Ubuntu Server edition so I can test and benchmark server software on it. I even just upgraded the hard drives to those 1TB Seagate hybrid ones. It looks like a laptop, but weighs something like 20 lbs. It's a bitch to lug around, so I never really did. I didn't really carry the others around either. Windows and Ubuntu desktop didn't work well without a mouse, and I hated using the trackpads on those systems, even the ones built to run Windows.
I was never much of an Apple fan. On the hardware side, you got way more bang for your buck building your own rig. And Apple's software was just awful until OS X. They'd build a slick, packaged hardware platform and put an OS on it that used 110% of its resources. The old Macs spent more time swapping the OS to floppy disk than doing any useful work. I avoided Apple like the plague.
I used to talk a lot like Natebishop3: way better hardware for cheap, and you didn't get stuck with OS 9 and all the Pascal legacy bullshit.
I got that Sager to run Ubuntu, not Windows. Windows was always an OK X terminal, but only useful as an OS for lack of anything else. XP? Hahahaha. A 3.2GB memory limit even if you had a 64-bit CPU and 32GB of RAM. Microsoft was absurdly late to the 64-bit world. Vista was a disaster, unusable. Windows 7 actually worked, but it was still Windows.
Microsoft uses a file system called NTFS. It works fine at first, but once your hard disk starts filling up and files fragment, all that whiz-bang, super-cheap, blazing-fast hardware slows down the more you use your computer. Seriously, has everyone not noticed that wiping your hard drive and installing Windows fresh makes your computer run fast again, like when it was new?
So I worked in a Windows shop. Windows Server for email and other things, Windows XP on every Dell workstation. I bought 8GB of RAM for my Dell for $69 out of my own pocket and installed it. Then I installed 64-bit Ubuntu on it and ran Windows in a VM to access the two mission-critical apps my job required. One was a .NET app that would not display properly in any browser but Internet Exploder... That legacy hell is a huge reason to stay away from M$ in the first place.
Ubuntu worked great. It handled 90% of my job. I never liked OpenOffice, so I used M$ Office in the VM. I have been a Unix geek since the mid-1980s, when it ran on PDP-11s and VAX systems. I ran Linux in the early days, when the kernel version was 0.3.x, and exchanged emails with Linus regularly.
I built data-center-scale applications using FreeBSD, which was simply better software (every algorithm well done). I knew many of the core committers; the best of them ended up hired by Apple.
Ubuntu showed the promise of what Unix could do for the desktop. The graphical UI was on par with Windows, but the applications were amateurish, often not as well developed as the commercial ones you could get for Windows. GIMP vs. Photoshop? Adobe did it way better.
Apple's hardware is plenty good enough. I play Diablo 3 on the laptop with every graphics option set to max and get a fast frame rate. What I also get is 8 hours of battery life, compared to 2 on my Windows-class laptops, or 20 minutes on the Sager.
The operating system is just so well done. It's Unix under the hood, but the Cocoa UI makes for stunningly great apps.
Why spend less money on better hardware that runs shitty software?
I learned my lesson.
As for games, that's what the PS4 is for.