Once upon a time, the video card was there so you could see stuff. It didn’t have to do much, really; just toss up some text characters and maybe some rudimentary graphics on the screen. If your computer didn’t have a separate video card, the graphics were built in, and that meant you had a “game” computer, not a powerhouse business computer.
Which is why those of us with Ataris and Commodores and Apples and CoCos laughed at IBM users: their games just weren’t very good or entertaining.
At some point all that changed and hardware-accelerated 2D graphics cards started hitting the market. It wasn’t long before the IBM PC compatible tossed away its monochrome and 4-color graphics and started to hit the 16.7 million color mark. That wasn’t good enough, and along came 3dfx with their Voodoo card, which launched an entire industry.
Now, the part that used to be there only so you could see what you were typing has become a computer in its own right, sometimes more powerful than the computer it’s sitting in.
I hate buying video cards.
It’s gone from being easy to being difficult. Back in the Not-as-old-as-monochrome-but-still-pretty-far-back days it was fairly easy. If you had an ATI Radeon 7000 you could be reasonably sure that a Radeon 7200 was going to be better. Similarly, if you had a creaky ol’ GeForce 2 then you could rest easy knowing that a GeForce 3 would be better. Or, if you were one of those weirdos on the fringe, you could rest assured your Matrox G400 was reasonably better than your Matrox G200. Probably.
But now it’s a whole new ball game. New video cards come out, geez, I guess monthly now. By the time you buy the latest and greatest and bring it home it’s already out of date.
Let me backtrack a little bit and explain why I was building this new computer in the first place: to attach a third monitor. My NVIDIA card only supported two monitors, and to get a third going I would have to buy another 470 GTX and hook the two up together. Only my motherboard had an AMD chipset and would only support AMD’s CrossFire, not NVIDIA’s SLI method. This was all thanks to AMD buying ATI.
I had, briefly, considered buying an ATI card because they use CrossEye or Eyefinity or something to support three monitors with one card. I didn’t do it, though, because I’m familiar with NVIDIA, didn’t understand ATI’s naming scheme, and knew how to use the NVIDIA driver in Linux (which is actually dirt simple).
So, to get two 470 GTX cards working I would have to replace the motherboard. If I was going to do that, I might as well get that funky new eight-core processor. And, since it was cheap, get some faster memory. And a new case to put it all in. There was no reason not to add some speedy SSDs to the mix, either.
And that’s how things spiral out of control.
But now I was free to get another 470 GTX. The only problem was that you couldn’t get them anymore. Maybe you could find one used, but you wouldn’t find a new one. You’d find older cards and newer cards, but not the 470. After doing a lot of research I came to the conclusion that the reason for this was that the 470 was just an excellent and speedy card. If you had a 470 there was little reason to buy a new card (unless you wanted three monitors), so it had to die.
That’s where the naming confusion comes in. You see, in my mind, anything with a higher number should be better than something with a lower number; hence, a 540 GTX should be better than a 470 GTX. This is not the case. In fact, the only card I could figure was better than a 470 GTX was a 580 GTX. A 580 GTX was way more expensive than a 470 would have been, had you been able to get one.
As I was walking through Fry’s, I noticed that there was a new card: The 680 GTX. It was $600. I didn’t want to pay $600 for a video card. So I went home. And I spent the next few weeks, literally, comparing NVIDIA cards to figure out which two (had to get a pair, see) I would decide on. I did add the 680 to my comparisons, though, just to see how awesome this new card was compared to the older ones.
That’s when I saw that with the new 680, NVIDIA made some changes and made it possible to use three monitors with one card. Which is why, after weeks of soul searching, wallet patting, and painstaking research I eventually decided to throw caution to the wind and get the newest card available: The 680 GTX.
After that, it was just a matter of finding the stupid thing. It turns out the 680 was so new that they were all being bought up as soon as they hit the shelves. So I spent days signing up for email notifications to be alerted when one of these things showed up. Online, offline, it didn’t matter to me. It took a random trip to Fry’s (after being told on the phone that they didn’t have any) to finally snag one. So I brought it home.
Did you see the irony there? The bit where I started this whole project because one card couldn’t drive three monitors, and ended up buying a single card that supported three monitors (four, actually, but that’s a whole other thing)? This is how the universe messes with me. For bonus amusement, the 690 GTX was announced just a couple of days after I got the 680. Not that I mind too much, as I don’t have an interest in it, but that’s the way things go sometimes.