If you know the difference between DVI and VGA, skip the first two paragraphs.

I knew nothing about this issue until I got a 22" LCD monitor about a week ago. As I glanced over the documentation, I noticed it listed two cables and had two different ports, but there was only one cable in the box.

Included was a VGA analog cable, the old standard for converting digital data from your graphics card or motherboard into an analog image on a CRT monitor. Early flat-panel displays just converted that signal back to digital, adding latency and causing some degradation, though at the common resolutions of the time you'd barely notice. Still, it didn't take long for DVI to appear--a digital interface that eliminates the extra conversions and minimizes degradation.

Fast forward to 2008. Walk into an electronics store. Is there a CRT monitor anywhere in sight? Turn your computer around. I'm not sure about motherboards, but if you have a graphics card manufactured in the last five years, it will have a port that looks like this: [image]
or at least this: [image]

Yet I drop three bills on a digital monitor, they let me walk out of the store with an analog cable, and when I call back they tell me that THE PROPER CABLE FOR CONNECTING MY DEVICE runs another $20-30.

I got the cable today (it's still not exactly the right one--single-link instead of dual-link, but that makes zero difference on my current system), and at 1680x1050 it's a very noticeable improvement.
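For the curious, here's why single-link is enough: single-link DVI tops out at a 165 MHz pixel clock, and you can sanity-check whether a given mode fits with a quick back-of-the-envelope calculation. The blanking-interval numbers below are rough CVT reduced-blanking approximations, not exact timings from any particular monitor:

```python
# Rough check: does 1680x1050 @ 60 Hz fit within single-link DVI?
# Single-link DVI is limited to a 165 MHz pixel clock.

SINGLE_LINK_MAX_MHZ = 165

def pixel_clock_mhz(h_active, v_active, refresh_hz,
                    h_blank=160, v_blank=30):
    """Approximate pixel clock in MHz for a given video mode.

    h_blank/v_blank default to rough CVT reduced-blanking values;
    exact timings vary by monitor, so treat this as an estimate.
    """
    total_pixels = (h_active + h_blank) * (v_active + v_blank)
    return total_pixels * refresh_hz / 1e6

clk = pixel_clock_mhz(1680, 1050, 60)
print(f"1680x1050@60 needs ~{clk:.0f} MHz; "
      f"fits single-link: {clk <= SINGLE_LINK_MAX_MHZ}")
```

That works out to roughly 119 MHz, comfortably under the 165 MHz cap, so a dual-link cable would buy nothing at this resolution.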

So is this Best Buy being shady, or is it not yet standard to package LCDs with a DVI cable?