VIDEO CARDS

Video cards plug into the mainboard and are connected by a cable to the monitor. They convert information from the computer to information that a monitor can understand and display on the screen. Both the monitor and the video card must work together to display the best possible picture.

There have been standards established so that the manufacturers of both monitors and video cards work toward the same goal. Most of the early standards in color graphics were set by the biggest player in the home computer market - IBM.

The first color graphics adapter from IBM was CGA (Color Graphics Adapter) and was pretty crude - it could display 4 colors on a 320x200 pixel screen (or just 2 colors at 640x200).

That was followed by EGA (Enhanced Graphics Adapter) which could display up to 16 of 64 colors on a 640x350 pixel screen.

Then, when IBM introduced their PS/2 line of computers, they introduced VGA (Video Graphics Array), which is about the lowest resolution that may still be in use today. VGA displays graphics on a 640x480 pixel screen and text on a 720x400 pixel screen, showing 256 colors at a time from a palette of 262,144.

The next standard (the most common in use today) was SVGA (Super VGA) at 800x600, followed by XGA (eXtended Graphics Array) at 1024x768 and then UVGA (Ultra VGA) at 1280x1024.

Usually folks just quote the size now instead of the letters - they'll say "800 by 600" rather than "SVGA".

The number of colors displayed is a function of VIDEO MEMORY - the amount of video memory determines the number of colors that can be displayed. If each pixel has 4 bits of memory assigned to it, it can be one of 16 possible colors. If it has 8 bits, it can be one of 256 possible colors. 16 bits would give you about 64 thousand possible colors and 24 bits would give you over 16 million possible colors. If you have a video card with 1 Megabyte of memory, you could have 64K (16 bit) color at 800x600 resolution. To get the higher number of colors (24 bit), you'd need 2 MB of video memory. On the other hand, if you changed your screen resolution to 1280x1024, you'd only be able to display 16 colors - you'd need 4 MB of video memory to be able to display the maximum number of colors.
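The arithmetic above can be sketched in a few lines - this is just an illustration of the relationships described, with the function names made up for the example:

```python
# Colors available = 2 to the power of bits-per-pixel;
# memory needed for one screen = pixels * bits-per-pixel / 8 bits-per-byte.

def colors(bits_per_pixel):
    """Number of distinct colors a given bit depth can represent."""
    return 2 ** bits_per_pixel

def memory_bytes(width, height, bits_per_pixel):
    """Video memory (in bytes) needed to hold one full screen."""
    return width * height * bits_per_pixel // 8

print(colors(4))    # 16
print(colors(8))    # 256
print(colors(16))   # 65536
print(colors(24))   # 16777216

# 800x600 at 16-bit color fits in 1 Megabyte; 24-bit color does not.
print(memory_bytes(800, 600, 16))   # 960000 bytes - under 1 MB
print(memory_bytes(800, 600, 24))   # 1440000 bytes - you'd need 2 MB
```

Running the same check at 1280x1024 shows why that resolution drops you back to 16 colors on a 1 MB card.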

The math is really pretty easy if you remember that Mega means million and a byte is 8 bits. So one Megabyte would be 8 million bits. Then, at your 800x600 pixel resolution you've got 480,000 individual pixels that need color assigned to them. How many colors can you have? Divide the amount of memory you've got (in this case 8 million bits) by the number of pixels you want to use (480 thousand) and you come up with a little over 16.6 bits per pixel. So you've got enough memory for 16 bit color but not enough for 24 bit color.
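That worked example, spelled out as a calculation (using the same rounded figure of 8 million bits per Megabyte the text uses):

```python
# 1 Megabyte of video memory spread across an 800x600 screen.
total_bits = 8_000_000         # roughly 1 Megabyte, as the text rounds it
pixels = 800 * 600             # 480,000 individual pixels

bits_per_pixel = total_bits / pixels
print(bits_per_pixel)          # a little over 16.6 - enough for 16 bit
                               # color, not enough for 24 bit
```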

So, what you see on your screen depends on a lot of factors: the video card, the monitor itself, the screen resolution and the refresh rate.

REFRESH RATE

Beyond the number of pixels you want to display and the depth of color you might need, there's the question of how often this information needs to be sent to the monitor. If you are looking at a high-resolution photograph on your screen, then the 11.5 million or so bits being used to create the picture stay the same each time your monitor's screen is refreshed - say 60 times a second. But if you're looking at a changing picture, a number (or maybe all) of those 11.5 million bits need to be recalculated every 1/60th of a second and stored in the proper place, ready to be sent to the monitor. I'm not even going to begin to try to explain what happens - I sure don't know - call it magic! I know that some high-end video cards have enough memory to hold 4 (or more) complete screens in memory and a built-in processor that does the calculations necessary to keep all four (or more) screens updated in sequence, so when one screen is being fed to the monitor, the one behind it is ready to go. And that's just one of many schemes being used to try to keep up with modern demands.
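To get a feel for the scale of the problem, you can multiply it out - a rough sketch, assuming a simple uncompressed stream with no clever tricks (the function name here is just for illustration):

```python
# Bits that must be delivered to the monitor every second at a given
# resolution, color depth, and refresh rate.
def bits_per_second(width, height, bits_per_pixel, refresh_hz):
    return width * height * bits_per_pixel * refresh_hz

# 800x600 in 24-bit color is about 11.5 million bits per screen;
# refreshed 60 times a second, that's nearly 700 million bits every second.
rate = bits_per_second(800, 600, 24, 60)
print(rate)   # 691200000
```

Numbers like that are why those multi-screen buffering schemes exist.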

The original video display adapter cards were 8 bit ISA cards - now considered obsolete. The 16 bit ISA cards that followed are still being used in older machines, but the modern card of choice is the 32 bit/64 bit PCI card with a minimum of 4 Megabytes of video memory. The newest Pentium-II and Pentium-III mainboards have a dedicated video slot called an AGP (Accelerated Graphics Port) slot for the latest type of video display adapters. That, of course, requires a special type of video card designed for the AGP slot, which is a variation of the PCI slot.

And, like all computer information, by the time you read these words this information is probably outdated and some newer standard has taken over.

C'est la vie.
