Posting-Frequency: monthly (second Monday)
COMP.SYS.IBM.PC.HARDWARE.VIDEO Frequently Asked Questions - Part 3/4
Q) How does a video accelerator work, and will one help me?
The term accelerator is used so frequently that it has lost much of
its meaning. This section is intended to answer how a video card with
special purpose video acceleration works, typically called 'Windows
accelerator' or 'coprocessed' cards. In a general sense, the principles
here can be applied to 2D, 3D and digital video acceleration. For more
specific information about 3D and digital video acceleration, see "How
does a 3D graphics accelerator work?" and "What does a video codec do?".
Before we get into acceleration, we have to understand how a VGA card works.
A VGA card is a simple display adapter with no processing capability.
All the thinking is done by the CPU, including writing and reading of
text, and drawing of simple graphics primitives like pixels, lines and
memory transfers for images.
Programs like most DOS-based word processors run in VGA text mode
while graphics-based programs like games run in graphics mode. Microsoft
Windows 3.1 runs in VGA graphics mode by default, meaning that every pixel
you see as part of the background, a window or a text character has to
be written using basic VGA calls. As you can imagine, the low-level
nature of the VGA command set means that many commands are required to do
something as simple as moving or closing a window. To move a window, the
VGA commands might go something like this:
-Block transfer to store window contents in PC RAM
-Solid rectangle fill (to blank window - cosmetic)
-Block transfer to put window in new location in VGA RAM
-Block transfer or Write pixel to rewrite background behind
old window location.
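The sequence above can be sketched in C. This is only an illustration: the framebuffer is simulated as a plain array, and all the dimensions and helper names are made up for the example. The point is that on a real VGA card, every pixel touched by these loops crosses the bus, driven by the CPU.

```c
#include <stdint.h>
#include <string.h>

#define SCREEN_W 640
#define SCREEN_H 480

static uint8_t vga_ram[SCREEN_W * SCREEN_H];   /* the card's memory   */
static uint8_t pc_ram[SCREEN_W * SCREEN_H];    /* system memory       */

/* Block transfer: copy a w x h rectangle between two framebuffers. */
static void blt(uint8_t *dst, int dx, int dy,
                const uint8_t *src, int sx, int sy, int w, int h)
{
    for (int y = 0; y < h; y++)
        memcpy(dst + (dy + y) * SCREEN_W + dx,
               src + (sy + y) * SCREEN_W + sx, (size_t)w);
}

/* Solid rectangle fill. */
static void fill(uint8_t *dst, int x, int y, int w, int h, uint8_t c)
{
    for (int row = 0; row < h; row++)
        memset(dst + (y + row) * SCREEN_W + x, c, (size_t)w);
}

/* Move a w x h window from (ox,oy) to (nx,ny), step for step as in
 * the list above: save contents, blank the old spot (which also
 * repaints the background there), redraw at the new location. */
static void move_window(int ox, int oy, int nx, int ny,
                        int w, int h, uint8_t bg)
{
    blt(pc_ram, 0, 0, vga_ram, ox, oy, w, h);   /* store in PC RAM  */
    fill(vga_ram, ox, oy, w, h, bg);            /* blank old window */
    blt(vga_ram, nx, ny, pc_ram, 0, 0, w, h);   /* put at new place */
}
```

For a 200x200 window at 8 bits per pixel, that save-and-restore traffic alone is about 80,000 bytes across the bus.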
Clearly, an enormous amount of data must move from the VGA card,
along the bus, into the CPU, and on into memory, and vice versa. This
has to occur because the VGA card has no processing capability of its
own, it relies on the CPU. Now we are in a position to understand how
a graphics accelerator works.
A VGA card has its own memory and digital-to-analog converter (DAC),
but can't actually process data. Accelerated video cards have their own
processor, and therefore are called video coprocessors. This means such
a card can perform many video operations by itself, with only minimal
input from the CPU. Let's go back to our example of moving a window.
Assume our 'accelerated' card can keep track of:
-the background fill pattern
-the location and contents of rectangular regions, i.e. windows
-and has adequate memory to store them.
To move a window, the CPU has to transmit something like:
-'move window' instruction
-location to move to
At this point, the video card can perform all of the operations the
CPU would have had to with a VGA card. This frees the bus and CPU to
execute other tasks, and speeds up video operations, as they're all done
on the video card. Why is this faster? Unlike VGA mode, where every
pixel has to be moved to and from the card via the bus and CPU, the
accelerated card can perform the same operations with instructions
consisting of only a few bytes being transferred along the bus. This
will result in an enormous performance gain for most common graphics
operations including bitmap and pixmap transfers and painting, movement
of sprites and icons, opening and closing of windows, filling with solid
colours and patterns, line drawing, polygon painting, etc. As a result,
even an ISA bus accelerator video card can provide blistering speed
improvements over VGA in graphical environments like Windows 3.1, OS/2,
the X Window System (i.e. XFree86) and AutoCAD. Some operations like animations
or raw video playback which require large block transfers at high rates
will benefit less from accelerator cards.
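To make the "few bytes along the bus" point concrete, here is a sketch of what a driver for an accelerated card might do. The register layout below is entirely hypothetical - every real chip defines its own - but the principle is the same: the CPU writes a short command packet, and the coprocessor moves the pixels itself.

```c
#include <stdint.h>

/* Hypothetical command registers of an accelerator chip. In a real
 * driver this struct would overlay memory-mapped I/O registers. */
struct accel_regs {
    uint16_t src_x, src_y;    /* source rectangle origin      */
    uint16_t dst_x, dst_y;    /* destination origin           */
    uint16_t width, height;   /* rectangle size               */
    uint16_t command;         /* operation to perform         */
};

#define CMD_SCREEN_COPY 0x0001   /* screen-to-screen block transfer */

/* Issue a 'move window' command; returns the number of bytes the
 * CPU actually had to send across the bus. */
static unsigned move_window_accel(volatile struct accel_regs *regs,
                                  int ox, int oy, int nx, int ny,
                                  int w, int h)
{
    regs->src_x = (uint16_t)ox;  regs->src_y = (uint16_t)oy;
    regs->dst_x = (uint16_t)nx;  regs->dst_y = (uint16_t)ny;
    regs->width = (uint16_t)w;   regs->height = (uint16_t)h;
    regs->command = CMD_SCREEN_COPY;   /* the card does the rest */
    return sizeof *regs;
}
```

Compare the 14 bytes written here with the tens of kilobytes of pixel traffic the unaccelerated VGA path needs for the same window move.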
Some newer accelerator cards include functions for 3D graphics
rendering like polygon shading, coordinate manipulation and texture
mapping. Others provide on-the-fly magnification of video clips so
that those MPEG movies don't appear in a box that's three inches wide
and two inches high on your screen.
However, keep in mind that the implementation of a given video
coprocessor is proprietary. This means we're tied to a system where
every video accelerator has a set of proprietary drivers which interpret
video commands. Different drivers are required for each operating system
or software program that wishes to take advantage of acceleration
functions. Some 3D graphics standards like SGI's OpenGL and PHIGS are
being integrated into workstation video hardware, and perhaps in the
future a 3D (or even 2D!) standard will be accepted by PC component
manufacturers to provide a consistent set of video instructions for software
to use.
Q) What does a video codec do?
Anybody who has played back a movie on their computer knows that the
video is often choppy and low resolution. The reason is that current PC
technology simply can't handle the amount of data required to display
uncompressed full-screen video. To understand why, we just have to
look at the amount of data contained in a video clip. If we want to
record a standard video signal for digital playback, we have to
digitize it at about 640x480 pixels/frame. At a frame rate of 30
fps (frames per second), and true colour (16.7 million) we would be
pumping 640x480x30x3 = 28 Mbytes/s through our computer. At that data
rate, a 650 Mbyte CDROM would hold only 23 seconds of video! CDROM
reader and hard drive technologies don't allow us to transfer data at
such high rates, so in order to display digital video it is compressed.
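The arithmetic above can be spelled out directly (capacities here are in decimal megabytes, i.e. 10^6 bytes, as is usual for media sizes):

```c
/* Uncompressed video data rate: width x height x frames/s x bytes/pixel. */
static long video_bytes_per_second(long w, long h, long fps, long bytes_pp)
{
    return w * h * fps * bytes_pp;
}

/* How many whole seconds of such video fit on a medium. */
static long playback_seconds(long capacity_bytes, long rate)
{
    return capacity_bytes / rate;
}
```

For 640x480, 30 fps, 3 bytes/pixel this gives 27,648,000 bytes/s (the "28 Mbytes/s" above), and a 650 Mbyte CDROM holds 23 seconds of it.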
Compressed video streams are read from a hard drive or CDROM, then are
decompressed before being displayed. This decompression is very CPU
intensive, and displaying the resulting video pushes the limits of
the peripheral bus (usually ISA, VLB or PCI) and video cards. If any
of the hard drive/CDROM reader, CPU, bus or video card can't keep up
with the high amount of data, the video clip appears choppy, or is
displayed very small.
The software or hardware that performs the decompression (or
compression when recording video) is called a codec (compressor-
decompressor). Dedicated hardware codecs are available either as
add-in cards or are integrated into video cards. The advantage of
such hardware is that it is optimized specifically for the quick
decompression and display of video data, so can provide higher
frame rates and larger images than a computer using a purely
software-based codec routine. Hardware codecs also reduce the
computing load on the system CPU, allowing it to perform other tasks.
Several types of compressed video formats exist, including MPEG
(Moving Picture Experts Group), AVI, MOV, Indeo, MS-Video, Cinepak
and Quicktime. In addition, different versions of these formats
exist, some incorporating sound. Under optimal conditions, some of
these formats can provide compression ratios of up to 100:1 while
still providing good quality video.
Some hardware codecs are optimized to work best with a particular
video format, but most support the basic operations required to
display compressed digital video streams.
Any given digital video accelerator may support some or all of the
following functions:
Codec - Decompression of compressed video from various formats.
Colour space conversion - Conversion of the video signal from YUV
colour space to computer-display-compatible RGB. The YUV colour
space is derived from the composite video signal that is the source
of most video clips.
Image clipping, filtering and scaling - Filtering reduces the amount
of graininess in the image. Scaling can be of different types:
Pixel replication - This simply means that pixels are doubled
in both the x and y directions - a 320x240 image is displayed
as a 640x480 image with larger pixels. This results in poor
image quality.
Pixel interpolation - Uses an image processing filter (usually
an averaging algorithm) to interpolate pixel values. This
provides a smoother image than direct pixel replication.
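The two scaling methods can be sketched as follows, on an 8-bit greyscale image. The interpolation here is a simple neighbourhood average; real scaling hardware uses more elaborate filters, so treat this only as an illustration of the idea.

```c
#include <stdint.h>

/* Pixel replication: each source pixel becomes a 2x2 block. dst
 * must hold a (2w) x (2h) image. */
static void scale2x_replicate(const uint8_t *src, int w, int h,
                              uint8_t *dst)
{
    for (int y = 0; y < 2 * h; y++)
        for (int x = 0; x < 2 * w; x++)
            dst[y * 2 * w + x] = src[(y / 2) * w + (x / 2)];
}

/* Pixel interpolation: odd output pixels average the source pixels
 * they straddle, giving a smoother (less blocky) result. */
static void scale2x_interpolate(const uint8_t *src, int w, int h,
                                uint8_t *dst)
{
    for (int y = 0; y < 2 * h; y++) {
        for (int x = 0; x < 2 * w; x++) {
            int sx = x / 2, sy = y / 2;
            int nx = ((x & 1) && sx + 1 < w) ? sx + 1 : sx;
            int ny = ((y & 1) && sy + 1 < h) ? sy + 1 : sy;
            /* average the source neighbourhood (rounded) */
            dst[y * 2 * w + x] = (uint8_t)
                ((src[sy * w + sx] + src[sy * w + nx] +
                  src[ny * w + sx] + src[ny * w + nx] + 2) / 4);
        }
    }
}
```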
Some of the new video cards provide a degree of hardware acceleration
for video playback, while others claim to provide full-screen 30 fps
video but don't have the necessary hardware. My advice is to test
drive any card that you are considering in a machine that is similarly
configured to your own before buying.
Q) How does a 3D graphics accelerator work?
As you know, the vast majority of computer displays are two-dimensional.
As a result, most of the objects which are represented on computers are
also 2D. Examples of 2D objects include text, images and animations. Of
course, most of the world is 3D, so there are obvious advantages in being
able to represent real-world objects in a realistic way.
The 3D representation that I'm referring to here is really surface
modeling, but involves true 3D objects. This shouldn't be confused
with games like Doom or Wolfenstein 3D, which are really just souped-up
2D games.
The way that 3D objects are traditionally represented is using a meshwork
of polygons - usually triangles - to describe their outside surface. If
enough polygons are used, then even curved surfaces can look smooth when
projected onto the computer display. The minimum parameters which have
to be defined to describe a 3D object and its view are:
-the coordinates of the object's polygon vertices (corners)
-polygon (or vertex) normals, which tell us which side of the polygon
faces out of the object and which faces in, and are used for shading
-reflection characteristics of the polygonal surfaces
-the coordinates of the viewer's location
-the location and intensity of the light source(s)
-the location and orientation of the plane that the 3D scene will be
projected onto (i.e. the computer screen)
Once all of this information is available, the
computer performs a process where it projects the 3D scene, given the
above information, onto the 2D computer screen. This process is called
rendering, and involves equations for tracing from the viewer through the
scene, equations for determining how light is reflected from light
sources, off of objects and back to the viewer, and algorithms for
determining which objects in the scene are visible, and which are
obscured. Often, depth cueing is also performed to make distant objects
darker, giving more of a 3D feel.
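One small step of that process - projecting a 3D vertex onto the 2D screen plane - can be sketched as below. This assumes the simplest possible setup (viewer at the origin looking along the z axis, projection plane at distance d); a full renderer repeats this for every polygon vertex and then shades and depth-sorts the results.

```c
struct vec3 { double x, y, z; };
struct vec2 { double x, y; };

/* Classic perspective projection: by similar triangles, a point's
 * screen position shrinks toward the centre of view as its distance
 * z from the viewer grows. */
static struct vec2 project(struct vec3 p, double d)
{
    struct vec2 s;
    s.x = p.x * d / p.z;
    s.y = p.y * d / p.z;
    return s;
}
```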
The point of this description is to impress upon you that the 3D
rendering process is highly complex, and involves an enormous number of
computations, even for simple scenes with few objects and light sources
and no shading. The addition of shading often more than doubles
computational time. If the computer's CPU had to perform all of these
operations, then rendering a scene would be very sluggish, and things
like real-time renderings (i.e. for games or flight simulators) would
not be possible.
Happily, new 3D graphic card technology relieves the CPU of much of the
rendering load. 3D operations are accelerated in a similar manner as
standard windowing operations are for say, Windows 3.1. The application
program is written using a standard 3D graphics library like OpenGL,
Renderman or another. A special-purpose driver, written specifically
for that 3D graphics card, handles all requests through the 3D graphics
library interface, and translates them to the hardware. Using a
software driver adds an additional layer between the application and
video card, and as a result is slower than accessing the hardware
directly. However, most of the 3D video hardware is proprietary, which
means that without a driver, an application developer would have to
write a version of their program for each 3D graphics card available.
An additional advantage to having a driver, is that if a new 3D graphics
standard is released, or an old one is updated, a new driver can be
written to support the new standard.
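The driver layer described above amounts to a level of indirection: the application calls a generic library entry point, and a per-card driver supplies the implementation. A minimal sketch in C (the structure and names are illustrative, not any real API) might look like this:

```c
#include <string.h>

/* A generic 3D operation, implemented differently per card. */
struct gfx_driver {
    const char *name;
    void (*draw_triangle)(const double v[3][3]);
};

/* One hypothetical hardware driver ... */
static void hw_triangle(const double v[3][3])
{
    (void)v;  /* would program the accelerator's registers */
}

/* ... and a software fallback that rasterizes on the CPU. */
static void sw_triangle(const double v[3][3])
{
    (void)v;  /* would compute pixels in the CPU */
}

static struct gfx_driver drivers[] = {
    { "hypothetical-3d-card", hw_triangle },
    { "software",             sw_triangle },
};

/* The library picks whichever driver matches the installed card; the
 * application never touches the proprietary hardware directly. */
static struct gfx_driver *select_driver(const char *card)
{
    for (unsigned i = 0; i < sizeof drivers / sizeof drivers[0]; i++)
        if (strcmp(drivers[i].name, card) == 0)
            return &drivers[i];
    return &drivers[1];   /* fall back to software rendering */
}
```

Supporting a new card, or a new version of the 3D standard, then means shipping a new driver table entry rather than rebuilding every application.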
For the 3D rendering example above, the rendering process can be sped up
through the use of the special-purpose hardware on the video card.
Instead of the main CPU having to perform all of the operations necessary
to calculate the colour and intensity of each pixel being rendered, all
of the 3D scene information can be sent directly to the video card in its
raw form. Polygon vertices and normals, surface characteristics,
location of the viewer, light sources and projection plane are all
off-loaded to the 3D video card. Then the video card, which is optimized
to perform 3D operations, can determine what image is displayed, and dump
it to the screen, while the system CPU is free to perform other tasks.
For more information on 3D graphics chipsets and card model
specifications, refer to:
the PC 3D Graphics Accelerators FAQ
And some additional info, with a number of links to more information about
specific 3D chipsets and manufacturers;
Here are a couple of other links, which have information on a large
number of 3D graphics standards, and also give some insight into how
some popular 3D gaming engines work;
Q) Which video card is best for DOS/Windows/X11/OS/2?
[From: Michael Scott (email@example.com) with some from
Dylan Rhodes (Formerly of Hercules) ]
It would be irresponsible (and very difficult to keep current) for
anyone trying to produce an objective document to suggest which
video cards are 'best'. The answer is complicated, since no one
card is best for all applications. The best video card for you
will depend on:
The operating system you will be using (i.e. DOS/VGA or GUI)
Display addressabilities and colour depths (i.e. 800x600x16 bit)
The bus your computer uses (PCI, VLB, EISA, ISA, MCA)
The types of applications you will be using most.
Also, don't be fooled into thinking that the absolute fastest card
going is the best deal. If you're using anything other than VGA
pixel addressabilities (up to 640x480x4 bit) then driver availability
and stability are very important. You can save yourself a lot of
problems by getting a card that comes with good, solid drivers
for your operating system(s) and good configuration utilities.
Make sure you choose a vendor who can provide quick, accurate and
friendly technical support when you need it and via the means that
you choose (telephone, e-mail, etc.).
That being said, more timely information on available video chipsets
is included in Appendix B.
Q) Is my card supported under Windows 95, OS/2, Linux-XFree86, etc?
In general, all cards provide basic VGA support, and if your card does so
you should be able to run just about any operating system at VGA pixel
addressability. However, 640x480 is not a high enough pixel addressability
for most GUIs.
The best course of action is to contact your card manufacturer to see if
they provide drivers for the OS in question. As an alternative,
monitoring or posting to pertinent newsgroups should get you a quick
answer. For the particular operating system that you wish to use, you
have three choices:
1) The best option is to contact your video card vendor and get the
latest drivers for your card. Make sure you know the make and model number.
Sometimes, the vendor will ask for which chip revision your card is using
e.g. a Tseng ET4000w32 card could be the original w32, w32i or w32p. These
may be available on-line via ftp or www sites, or may be on a BBS someplace
(likely on the other side of the continent). Alternatively, contact the
retailer you bought the card from. The big advantage to getting the drivers
from the card supplier is that they should take full advantage of the card's
capabilities, including using accelerated functions when possible, and
providing high pixel addressability and high colour-depth modes.
2) As an alternative, SVGA drivers will likely come with the operating
system. If these drivers follow the VESA SVGA standards, and your card does
also, you will be able to take advantage of the higher pixel addressability
modes your card is capable of. Unfortunately, you will _not_ be able to
take advantage of any of the acceleration features of your card, and in many
cases you will not have access to the higher colour depths (like 24 bit
colour). My experience has been that these drivers tend to be quite stable.
3) Usually, vendors are responsible for supplying drivers for their
particular video cards. In many instances, though, the original drivers
were written by the chip manufacturer, then supplied to the vendor. In some
cases, the chip vendor releases generic drivers for a given chipset. These
may be available on an ftp or web site. Such drivers will likely take
advantage of acceleration features of the chipset, but may not know about
some particular features of your model of card.
Q) Which video benchmark is the best?
I won't stand at the pulpit and get carried away, but here are some
things to consider when looking at benchmark figures.
[From: Dylan Rhodes (Formerly of Hercules)]
"Any benchmark program is separated from the real world to some degree.
The fastest benchmark score on the planet means little to the user if
their applications crash, or if they can't get help when they need it."
[Michael Scott (firstname.lastname@example.org)]
1. The first thing to remember is that a benchmark measures the speed of
certain specific operations that the computer is performing. You have to
decide if a given benchmark is measuring anything that is meaningful to
_you_. This isn't always easy, because often benchmark authors don't
provide details on exactly what operations their test suite is performing.
2. Results from one benchmark program cannot be extrapolated to other
applications or benchmarks. In particular, VGA (DOS) benchmarks may
be completely unrelated to GUI (i.e. Windows 3.1, OS/2, etc) benchmarks.
This is because the VGA circuits on many video cards are completely
separate from the graphics accelerator (Matrox is an example).
3. Comparisons of the same benchmark on different systems may, or may
_not_ be meaningful. For example:
Most so-called 'video benchmarks' rely heavily on the CPU, and may
not be good indicators of the speed of the video card itself. This is
not necessarily a fault of the benchmark author. For example, the
majority of VGA operations are performed in the CPU, then the raw pixels
are dumped down the bus. This implies that _all_ programs
which measure the speed of VGA operations are highly dependent on CPU
speed.
One particularly popular graphics benchmark is 3DBench. This is a
VGA-based benchmark that will _not_ take advantage of any acceleration
capabilities of your video card. It strictly measures DOS VGA speed
which is highly CPU dependent. As a result, it is _not_ a good measure
of video card speed, but rather measures combined CPU _and_ video card
_and_ bus speed. In fact, I believe it was written before VLB even
existed, so I doubt it takes advantage of that, either. It is very
difficult (impossible?) to measure the pure VGA speed of a card because
of this CPU and bus dependency.
GUI-based benchmarks consist of WinMarks, WinStones, WITS, Xstones,
etc. Again, most of these are highly CPU dependent, but the advantage
of these benchmarks is that when used with the appropriate driver for
your video card (i.e. _not_ the VGA/SVGA drivers that come with
Windows 3.1 or XFree86) they can take advantage of your card's
acceleration capabilities. In particular, WITS and WinStone measure the
time taken by real-world applications, so they are a closer indicator of
how much of a speed increase you should see on a day-to-day basis.
4. Don't expect a new video card to make your whole system scream.
No matter how fast a video card is, it's only responsible for a portion
of the overall system speed. You won't get Lamborghini performance out
of a Lada, even if you put a V8 in it. :-)
Q) Should I have BIOS shadowing on?
The code which tells the computer how to access the video card is stored on
the video card itself in a ROM (Read Only Memory) chip. When the computer
wants to access the video card, it uses the video BIOS (Basic Input/Output
System) routines on the ROM chip(s). The only real problem with this is
that ROM chips run more slowly than the traditional DRAM which is used for
system RAM. As a result, most (if not all) modern BIOS setup utilities
(sometimes referred to as CMOS) allow the video BIOS to be copied to a
section of main system DRAM (this is the shadowing). This has the benefit
of speeding up video operations between the CPU and video card because the
video BIOS 'instructions' can be read more quickly from the shadow RAM, and
the disadvantage of using a relatively small block of upper memory (the
chunk of memory is located above 640k and below 1 Meg).
When video BIOS shadowing is turned off, some systems and memory managers
allow you to use that chunk of memory to load TSR's (i.e. mouse driver,
cdrom driver) which may allow you to free up some additional conventional
memory. When turned on, video operations will be performed faster, at
the expense of a chunk of upper memory. Unless you're tight for upper
memory or have a compatibility problem, try running with shadowing on.
Q) Should I use a Universal VESA driver? (i.e. UNIVBE)
The Video Electronics Standards Association has produced a standard for
SVGA video modes, commonly known as VESA VGA or VESA SVGA. This standard
includes the ability to address video memory linearly (i.e. as one large
contiguous block of memory) instead of using the 64k segments that must
be used for a VGA video adapter. Additional enhancements increase the
speed and efficiency of system RAM <--> video RAM transfers. Different
versions of this standard are supported by various different graphics
cards and drivers, but the most common are v 1.2 and most recently 2.0.
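The difference between the old 64k-segment scheme and linear addressing can be sketched like this. The set_bank() call stands in for the real VESA set-bank function; on actual hardware it is a BIOS interrupt or port write, and it is exactly the expensive step that linear addressing removes. The framebuffer here is simulated, so the sketch only counts how often the bank must change.

```c
#include <stdint.h>

#define BANK_SIZE 0x10000L          /* 64 kB window at A000:0000 */

static int  current_bank = -1;
static long bank_switches = 0;

/* Stand-in for the VESA set-bank call; the expensive part. */
static void set_bank(int bank)
{
    if (bank != current_bank) {
        current_bank = bank;
        bank_switches++;
    }
}

static uint8_t vram[1024L * 768];   /* simulated video memory */

/* Banked write: the pixel's address must first be mapped into the
 * 64 kB window, switching banks whenever a boundary is crossed. */
static void put_pixel_banked(long offset, uint8_t c)
{
    set_bank((int)(offset / BANK_SIZE));
    vram[offset] = c;
}

/* Linear write: the whole framebuffer is one flat block. */
static void put_pixel_linear(long offset, uint8_t c)
{
    vram[offset] = c;
}
```

Filling a 1024x768 8-bit screen touches 12 banks, so the banked path pays for 12 bank switches per full-screen operation (and far more for scattered accesses), while the linear path pays nothing.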
This VESA standard allows programmers to support a wide variety of
video devices without having to write hardware-specific drivers. The
cost of having a generic standard is that the code is rarely optimized
for any given video processor. As a result, a native-mode driver will
usually provide better performance than a comparable VESA mode.
The reason for this is that most vendors spend their time optimizing
Windows/Win95 drivers and not BIOS modes. Software VESA drivers like
UniVBE replace these BIOS routines with faster ones of their own,
resulting in improved performance. The speedup is possible because the
video card manufacturer has not fully utilized the capabilities of the
video hardware in the video BIOS.
Most new video cards have the VESA standard implemented in hardware, and
support VESA calls directly, without requiring a software driver. Some
older cards support an older version of the standard, or have errors
and/or inefficiencies in their VESA implementation. Other video cards
do not have VESA support at all.
Depending on what type of card you have, you may or may not see a speed
increase by using a VESA driver. The following guidelines may help you:
If you have a new card which supports VESA 2.0, then you
will not likely see any speed increase with a TSR like UNIVBE; in fact,
you may see a slight slowdown due to the extra overhead associated with
the driver.
If your card is slightly older and supports VESA 1.2 or a poorly implemented
2.0, then it's quite possible that you will see a small speed increase. As
an added bonus, if your VESA implementation has any bugs, UNIVBE will
fix them (when it's running of course).
If your card is quite old, you may see a significant speed improvement
due to the linear addressing and 16/32 bit transfers of the VESA standard.
However, your card must be able to support these operations (though not
necessarily support VESA modes in video BIOS). For cards that do not
support linear addressing, some gains may be realized because in general
the bank switching code of a software VESA driver like UNIVBE is faster
than the implementation in most video card BIOSes.
Q) I have problems with my display card - how do I fix them?
Without the proper technical training, you are limited in what you
can fix, but the most common problems encountered are due to buggy
or incorrect video drivers being installed.
If you are having display problems in Windows 3.1, Windows 95, OS/2,
XFree86 or just about any other graphics-based operating system,
then try the standard VGA or SVGA drivers that come with the
system. If the problems disappear, then the drivers you are
using are either buggy, corrupted, installed incorrectly or are
conflicting with something in your system. The best idea is to
make sure you have the most recent drivers - contact your computer
vendor, video card manufacturer or the Net. When you contact people
with problems, have your computer configuration information
in hand (see posting rules at the start of this FAQ).
If you are relatively certain that you have a hardware or software
conflict, see the section "Are there known conflicts with my video
card?".
Q) Why are some of my Windows 3.1 icons black? ('Extremely low memory, some icons may not be drawn')
This isn't really a hardware problem, but pops up often enough to justify
an answer here. It occurs due to a limitation of the way that memory is
allocated in the Program Manager (PM) application. Memory is allocated in
64kB chunks, and any given PM group has a maximum of 64kB to store the
application and working directory paths for each application, icon
positions and application icons. If the 64kB limitation wasn't bad
enough, the program manager does something which causes a problem if you
switch to a higher colour mode. Irrespective of the actual number of
colours present in an icon, the Program Manager allocates enough memory
for that icon _in the current colour mode_. I'll illustrate what this
means with an example:
Since each icon is 32x32 pixels in size, a 256 colour pixmap (that's 8
bit, or one byte per pixel) would require 32x32x1 = 1024 bytes, so
PM would allocate 1024 bytes for that icon. Given a maximum of 64kB
of memory, this would limit us to something less than a total of
64 icons in a PM group (since some memory is used for storing the other
info detailed above). If we switch to 24 bit colour mode, then PM will
automatically allocate 32x32x3 (that's 3 bytes per pixel) = 3072 bytes
_for each icon regardless of how many colours are actually in it_.
If we have a large number of icons (more than about 20) within a single
PM group, then PM won't have enough memory to store all the icon info.
As a result, some icons appear black because there is no icon information
stored for them in the PM.
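The Program Manager arithmetic above can be spelled out directly. Each icon is 32x32 pixels, stored at the current colour depth, and the whole group shares one 64 kB block:

```c
/* Bytes the Program Manager allocates per icon at a given depth. */
static long icon_bytes(int depth_bits)
{
    return 32L * 32 * depth_bits / 8;
}

/* Upper bound on icons per group: in practice the limit is lower,
 * since some of the 64 kB holds paths, positions and other info. */
static long max_icons(long group_bytes, int depth_bits)
{
    return group_bytes / icon_bytes(depth_bits);
}
```

At 8-bit colour each icon costs 1024 bytes (at most 64 per group); at 24-bit colour each costs 3072 bytes, dropping the ceiling to 21 - hence the "more than about 20" figure above.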
Unfortunately, there is no work-around that I am aware of for this. The
only solution is to break your PM groups into smaller ones which contain
fewer icons.
Q) I have problems with my monitor - how can I fix it?
[From: Sam Goldwasser (email@example.com)]
Advanced Monitor Adjustments and Troubleshooting:
READ AND FOLLOW SAFETY INFO IF YOU REMOVE THE COVER ON YOUR MONITOR.
More detailed repair info on a variety of computer and consumer electronic
equipment is available from:
Note that these are for advanced troubleshooting. We take no responsibility
for personal injury or damage to equipment that may result from inexperienced
or incompetent use of this info.
Most Common Problems:
* Intermittent changes in color, brightness, size, or position - bad
connections inside the monitor or in the VGA connector or cable.
* Ghosts, shadows, or streaks in picture adjacent to vertical edges - faulty
cables/termination, video card problems.
* Magnetization of CRT causing color blotches or other color or distortion
problems - faulty degauss circuitry or location near sources of strong
magnetic fields or electromagnetic interference.
* Monitor not syncing on one or more video scan ranges - monitor may be
incompatible with scan rates, fault in monitor electronics.
* Focus or brightness problems - adjustments needed for focus or
background brightness or defective components.
* Dead monitor due to power supply problems.
Monitor Manufacturing Quality and Cold Solder Joints:
Any intermittent problems with monitors that cause random sudden changes in
the picture brightness, color, size, or position are often a result of
bad connections, most commonly cold solder joints.
Bad solder joints are very common in monitors due both to poor quality
manufacturing as well as to deterioration of the solder bond after numerous
thermal cycles and components running at high temperature. Without knowing
anything about the circuitry, it is usually possible to cure these problems
by locating all bad solder connections and cleaning and reseating internal
connectors. There may also be bad connections due to poor contact at
the VGA connector or bad quality of the internal cable connections.
Ghosts, shadows, or streaks in picture adjacent to vertical edges:
Complaints about these kinds of problems are very common, especially as
screen resolutions and the necessary video bandwidth keep increasing.
Most are due to cable and video termination deficiencies and not actual
monitor faults.
The video signals for red, green, and blue (or just a single signal for
monochrome) are sent over cables which are generally 75 ohm transmission
lines. These are coaxial cables that may be combined inside a single
sheath for VGA, SVGA, Macs, and many workstations but may be separate coaxes
with BNC (or other) connectors for other video applications.
Without going into transmission line theory, suffice it to say that
to obtain good quality video, the following conditions must be met:
1. A good quality of cable must be used. This means one in which the
characteristic impedance is close to the optimum 75 ohms, one which has
low losses, and one which has good shielding. For installations
using BNC connectors, a good quality of 100% shielded RG59U is often used.
The BNC connectors must be properly installed or they will contribute
to mismatch problems.
2. Where multiple monitors are to be connected to a single video source,
all wiring is done in a daisy chain fashion. The only taps permitted
are the minimum necessary to connect each monitor to the chain. This
usually means a BNC-T connector or a pair of connectors on the monitor
for each video signal. T connections with cable must be avoided.
3. Only the last monitor in the chain should be terminated in 75 ohms. All
of the others must be set to Hi-Z. Monitors with BNC connectors will
usually have one switch or a switch for each color to select termination.
Monitors for PCs, Macs, and workstations usually have built in
termination and do not offer the choice of Hi-Z. This means that without
a video distribution amplifier, it is not possible to connect multiple
monitors of this type to a single video source with any expectation of a
good quality display.
Failure to follow these rules will result in video ringing, ghosts, shadows,
and other unsightly blemishes in the picture. It is often not possible to
control all aspects of the video setup. The cable is often a part of the
monitor and cannot easily be substituted for a better one. The monitor
may not have properly designed circuitry such that it degrades the video
regardless of the cable and display board quality. The display card itself
may not have proper drivers or source termination.
Ironically, the better the video card, the more likely that there will
be visible problems due to termination. This is due to the very high
bandwidth and associated signal edge rates.
Some examples of common termination problems:
* Overly bright picture with trails following vertical edges, perhaps with
periodic ringing. This is due to a missing termination. Check if the
monitor is set for Hi-Z instead of 75 ohms. If there is no switch, then
the termination may be faulty or the monitor may need an external resistor.
For BNC connectors, plug-on terminations are available.
* Bright ghost images adjacent to vertical lines. This may indicate that
the terminating resistor is greater than the impedance of the cable.
You may be using Ethernet Thinnet cable by accident which is RG58 with
an impedance of 50 ohms.
* Dark picture and ghost images adjacent to vertical lines. This may indicate
that the terminating resistor is too low - multiple monitors on a chain all
set for 75 ohms instead of just the last one. Or, an improper type of cable
such as audio patch cord.
* Fuzzy vertical edges. This may indicate a poor quality cable or a run
which is just too long. For high resolutions such as 1280x1024, the
maximum cable length may be as short as 25 feet or less for poor quality
cable. Better cable or fiber-optic repeaters may be necessary.
* Other similar problems - check cables for defective or improperly installed
connectors. This is especially applicable to cables with BNC or UHF type
connectors which require a kind of artistic talent to assemble properly.
If only 1 or 2 colors (of the R, G, and B) are affected, then look for
improper switch settings or bad connections (bad cable connectors are really
common) on the problem color cables.
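The severity of these symptoms can be estimated from the reflection
coefficient at the termination. A rough sketch (the Hi-Z input impedance
is an illustrative assumption, not a measured value):

```python
# Reflection coefficient at a mismatched video cable termination.
# Gamma = (Z_load - Z_cable) / (Z_load + Z_cable); 0 means no reflection.

def reflection_coefficient(z_load, z_cable=75.0):
    """Fraction of the incident signal reflected back down the cable."""
    return (z_load - z_cable) / (z_load + z_cable)

# Proper 75 ohm termination on 75 ohm video cable: no ghosting.
print(reflection_coefficient(75.0))                 # 0.0

# Missing termination (Hi-Z input, approximated here as 10k ohms):
# nearly the full signal reflects - bright trails and ringing.
print(round(reflection_coefficient(10_000.0), 2))   # 0.99

# 75 ohm terminator on 50 ohm Thinnet (RG58) cable by accident:
print(round(reflection_coefficient(75.0, z_cable=50.0), 2))   # 0.2

# Two monitors on a chain both set to 75 ohms: effective load is
# 37.5 ohms - dark picture with ghosts (negative reflection).
print(round(reflection_coefficient(37.5), 2))       # -0.33
```

A coefficient near +1 matches the "overly bright with trails" symptom;
a negative one matches the "dark picture with ghosts" symptom above.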
A monitor which has a picture that is very dark and cannot be adequately
set with the user brightness and contrast controls may need
internal adjustment of the screen (the term, screen, here refers to a
particular electrode inside the CRT, not really the brightness of the
screen you see, though it applies here), master brightness, or background level
controls. As components age, including the CRT, the brightness will
change, usually decrease. The following procedure will not rejuvenate
an old CRT but may get just enough brightness back to provide useful
functionality for a few months or longer. If the problem is not with the age
of the CRT, then it may return the monitor to full brightness. The assumption
here is that there is a picture but the dark areas are totally black and
the light areas are not bright enough even with the user brightness control
turned all the way up.
In most cases, the cover will need to be removed. The controls we
are looking for may be located in various places. Rarely, there will
be access holes on the back or side.
The controls may be located on the:
* flyback transformer. Usually there is a master screen control
along with a focus control on the flyback transformer. The flyback
(or L.O.P.T. for non-U.S. readers) is the component that generates
the high voltage for the CRT - it has the fat red wire attached to
the CRT with a thing that looks like a suction cup.
* a little board on the neck of the CRT. There may be a master screen
control, a master brightness control, a master background level control,
or individual controls for red, green, and blue background level. Other
variations are possible. There may also be individual gain/contrast
controls.
* the main video board (less common), but the background level controls may
be located here.
Display a picture, at the video resolution you consider most important,
which includes totally black areas, full white areas, and sharp
vertical edges.
Set the user brightness control to its midpoint and the user contrast
control as low as it will go - counterclockwise.
Let the monitor warm up for at least 15 minutes so that components can stabilize.
If there is a master brightness or background level control, use this to
make the black areas of the picture just barely disappear. Then, increase
it until the raster lines just appear. (They should be a neutral gray.
If there is a color tint, then the individual color background controls will
need to be adjusted to obtain a neutral gray.) If there is no
such control, use the master screen control on the flyback. If it is unmarked,
then try both of the controls on the flyback - one will be the screen control
and the other will be focus - the effects will be obvious. If you did touch
focus, set it for best overall focus and then get back to the section on focus
once you are done here.
If there are individual controls for each color, you may use these but be
careful as you will be affecting the color balance. Adjust so that the
raster lines in a black area are just visible and dark neutral gray.
Now for the gain controls. On the little board on the neck of the CRT
or on the video or main board there will be controls for R, G, and B gain
or contrast (they are the same). If there are only two then the third
color is fixed and if the color balance in the highlights of the picture
was OK, then there is nothing more you can do here.
Set the user contrast control as high as it will go - clockwise.
Now adjust each internal gain/contrast control as high as you can without
that particular color 'blooming' at very bright vertical edges. Blooming
means that the focus deteriorates for that color and you get a big blotch
of color trailing off to the right of the edge. You may
need to go back and forth among the 3 controls since the color that blooms
first will limit the amount that you can increase the contrast settings.
Set them so that you get the brightest neutral whites possible without
any single color blooming.
Now check out the range of the user controls and adjust the appropriate
internal controls where necessary. You may need to touch up the background
levels or other settings. Check at the other resolutions and refresh rates
that you normally use.
If none of this provides acceptable brightness, then either your CRT
is in its twilight years or there is something actually broken in the
monitor. If the decrease in brightness has been a gradual process over the
course of years, then it is most likely the CRT. As a last resort (untested)
you can try increasing the filament current to the CRT the way CRT boosters
that used to be sold for TVs worked. Voltage for the CRT filament is usually
obtained from a couple of turns on the flyback transformer. Adding an
extra turn will increase the voltage and thus the current making the
filament run hotter. This will also shorten the CRT life - perhaps rather
drastically. However, if the monitor was headed for the dumpster anyhow,
you have nothing to lose.
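To get a feel for how much an extra flyback turn changes things, the
filament voltage scales with the turns ratio. A rough sketch with assumed
values (the 6.3 V filament and the turn counts are typical examples, not
specifications for any particular monitor):

```python
def boosted_filament(v_nominal, turns):
    """Filament voltage after adding one turn to an existing flyback
    filament winding - voltage scales with the turns ratio."""
    return v_nominal * (turns + 1) / turns

# A 6.3 V filament fed from 2 turns: one extra turn is a big jump.
print(round(boosted_filament(6.3, 2), 2))   # 9.45

# From 3 turns the boost is gentler:
print(round(boosted_filament(6.3, 3), 2))   # 8.4
```

This is why the text warns the life penalty can be drastic: the fewer
the existing turns, the larger (and harsher) the step one turn adds.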
Slight deterioration in focus can be corrected by adjusting the focus
control usually located on the flyback transformer. Sometimes, this
is accessible externally but usually not. On monochrome monitors, the
focus control, if any, may be located on the main board.
Don't expect to have perfect focus everywhere on the screen. Usually there
will be some degradation in the corners. A compromise can generally be
struck between perfect focus in the center and acceptable focus in the
corners.
If the adjustments have no effect, then there is probably a fault in the
focus power supply.
Dead Monitor with Periodic Tweet, Tweet; Flub, Flub, or Whine:
A monitor which appears to be dead except for a once a second or so
tweet or flub usually indicates a fault in the switching power supply - often
a shorted rectifier.
A constant whine may mean a shorted component in the horizontal deflection
circuits or elsewhere.
Smoking is just as bad for monitors as for people and usually more quickly
fatal.
White acrid smoke may indicate a failed electrolytic capacitor in the
power supply probably in conjunction with a shorted rectifier. Needless to
say, pull the plug at once.
Tubes for all Nations:
[From: Jeroen Stessen, Philips TV-lab Eindhoven NL]
CRT manufacturers actually make different versions of their tubes for
TV's for the northern and southern hemisphere, and sometimes a 3rd neutral
type. These are, so to speak, precorrected for the uncompensated field. (Note
that the term 'tube' here includes much of the convergence hardware as
well - not just what is inside the glass.)
I remember when we exported projection televisions from Belgium to
Australia, a couple of years ago. They all had to be opened on arrival
to re-adjust the rotation settings on the convergence panel, due to
the different magnetic field in Australia. Projection TV's don't have
degaussing (there is nothing to degauss), and the customer can only
adjust red and blue shift, not rotation.
Our CRT application group has a "magnetic cage". This is a wooden cube
(approx. 2 meter long sides) with copper coils around each of the 6
surfaces. With this they can simulate the earth's magnetic field for
every place on earth (as indicated on a map on the wall).
Magnetic Fields and Degaussing:
[From: Sam Goldwasser ]
Indications of need for degaussing are small or large areas of the screen
where the colors are not correct or where color balance has suddenly
changed. There are other possible causes - both electronic and mechanical -
but stray magnetic fields are numero uno on the list.
The shadowmask or aperture grill of the CRT - the fine mesh just behind
the phosphor screen - is normally made of a material (steel or InVar) which
is easily magnetized. This can happen just by rotating the monitor on its
swivel, by moving it from one place to another, by switching on or off
some piece of electronic equipment near the monitor, even by a local
thunderstorm.
Since any stray magnetism affects the color purity and convergence, it is
important that the CRT is demagnetized before use.
Degaussing (demagnetizing) a CRT:
Degaussing may be required if there are color purity problems with the
display. On rare occasions, there may be geometric distortion caused
by magnetic fields as well without color problems. The CRT can get
magnetized:
* if the monitor is moved or even just rotated.
* if there has been a lightning strike nearby. A friend of mine
had a lightning strike near his house which produced all of the
effects of the EMP from a nuclear bomb.
* If a permanent magnet was brought near the screen (e.g., kid's
magnet or megawatt stereo speakers).
* If some piece of electrical or electronic equipment with unshielded
magnetic fields is in the vicinity of the monitor.
Degaussing should be the first thing attempted whenever color
purity problems are detected. As noted below, first try the
internal degauss circuits of the monitor by power cycling a few
times (on for a minute, off for 30 minutes, on for a minute, etc.)
If this does not help or does not completely cure the problem,
then you can try manually degaussing.
Commercial CRT Degaussers are available from parts distributors
like MCM Electronics and consist of a hundred or so turns of magnet wire
in a 6-12 inch coil. They include a line cord and momentary switch. You
flip on the switch, and bring the coil to within several inches of the
screen face. Then you slowly draw the center of the coil toward one edge
of the screen and trace the perimeter of the screen face. Then return to
the original position of the coil being flat against the center of the
screen. Next, slowly decrease the field to zero by backing straight up
across the room as you hold the coil. When you are farther than 5 feet
away you can release the line switch.
The key word here is ** slow **. Go too fast and you will freeze the
instantaneous intensity of the 50/60 Hz AC magnetic field variation
into the ferrous components of the CRT and may make the problem worse.
It looks really cool to do this while the CRT is powered. The kids will
love the color effects.
Bulk tape erasers, tape head degaussers, open frame transformers, and the
"ass-end" of a weller soldering gun can be used as CRT demagnetizers but
it just takes a little longer. (Be careful not to scratch the screen
face with anything sharp.) It is imperative to have the CRT running when
using these wimpier approaches, so that you can see where there are
still impurities. Never release the power switch until you're 4 or 5
feet away from the screen or you'll have to start over.
I've never known of anything being damaged by excess manual degaussing,
though I would recommend keeping really powerful bulk tape
erasers-turned-degaussers a couple of inches from the CRT.
If an AC degaussing coil or substitute is unavailable, I have even
degaussed with a permanent magnet but this is not recommended since it is more
likely to make the problem worse than better. However, if the display
is unusable as is, then using a small magnet can do no harm. (Don't use
a 20 pound speaker or magnetron magnet as you may rip the shadowmask right
out of the CRT - well at least distort it beyond repair. What I have in
mind is something about as powerful as a refrigerator magnet.)
Keep degaussing fields away from magnetic media. It is a good idea to
avoid degaussing in a room with floppies or back-up tapes. When removing
media from a room remember to check desk drawers and manuals for stray
disks.
It is unlikely that you could actually affect magnetic media but better
safe than sorry. Of the devices mentioned above, only a bulk eraser or
strong permanent magnet are likely to have any effect - and then only when
at extremely close range (direct contact with media container).
All color CRTs include a built-in degaussing coil wrapped around the
perimeter of the CRT face. These are activated each time the CRT is
powered up cold by a 3 terminal thermistor device or other control
circuitry. This is why it is often suggested that color purity problems
may go away "in a few days". It isn't a matter of time; it's the number
of cold power ups that causes it. It takes about 15 minutes of the power
being off for each cool down cycle. These built-in coils with thermal
control are never as effective as external coils. An exception is the
type in the better workstation CRTs that include a manual degauss button.
Note that some manufacturers warn of excess use of these buttons due to their
designs (read: inferior) where certain components like the coil or control
circuits may overheat. It has nothing to do with excess degaussing - just
excess use of their degauss circuitry.
How Often to Degauss:
Some monitor manufacturers specifically warn about excessive use of degauss,
most likely as a result of overstressing components in the degauss circuitry
which are designed (cheaply) for only infrequent use. In particular,
there is often a thermistor that dissipates significant power for the second
or two that the degauss is active. Also, the large coil around the CRT
is not rated for continuous operation and may overheat.
If one or two activations of the degauss button do not clear up the color
problems, manual degaussing using an external coil may be needed
or the monitor may need internal purity/color adjustments. Or, you may have
just installed your megawatt stereo speakers next to the monitor!
You should only need to degauss if you see color purity problems
on your CRT. Otherwise it is unnecessary. The reason it only works the
first time is that the degauss timing is controlled by a thermistor
which heats up and cuts off the current. If you push the button
twice in a row, that thermistor is still hot and so little happens.
One word of clarification: In order for the degauss operation to be
effective, the AC current in the coil must approach zero before the
circuit cuts out. The circuit to accomplish this often involves a
thermistor to gradually decrease the current (over a matter of several
seconds), and in better monitors, a relay to totally cut off the current
after a certain delay. If the current were turned off suddenly, you would
likely be left with a more magnetized CRT. There are time delay elements
involved which prevent multiple degauss operations in succession. Whether
this is by design or accident, it does allow the degauss coil - which is
usually grossly undersized for continuous operation - to cool.
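The decaying degauss current can be pictured as a 50/60 Hz sine wave
inside a shrinking envelope; the residual magnetization roughly tracks
the field amplitude at the moment the current is cut. A simple
illustrative model (the 1.5 s time constant is an assumed value, not
taken from any particular monitor):

```python
import math

def degauss_field(t, f=50.0, tau=1.5, b0=1.0):
    """AC degauss field: a 50 Hz sine whose amplitude decays as the
    thermistor heats up (modelled as an exponential envelope)."""
    return b0 * math.exp(-t / tau) * math.sin(2 * math.pi * f * t)

# Amplitude remaining when a relay cuts the current after 8 seconds:
print(f"{math.exp(-8 / 1.5):.4f}")    # 0.0048 - essentially demagnetized

# Cutting the current suddenly after only 0.5 s would freeze a large
# fraction of the field into the shadowmask:
print(f"{math.exp(-0.5 / 1.5):.2f}")  # 0.72 of full amplitude
```

The same picture explains the manual procedure above: backing slowly
across the room is just a hand-operated version of this decaying envelope.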
One of the most common complaints is that the monitor is not as crisp as
it used to be - or just not as sharp as expected.
Assuming that the focus has just been gradually getting worse over time,
tweaking the internal focus control may be all that is needed.
On most monitors, the flyback transformer includes two controls - FOCUS and
SCREEN. The one you want is, of course, FOCUS.
Safety: as long as you do not go near anything else inside the monitor while
it is on AND keep one hand in your pocket, you should be able to do this without
a shocking experience.
Plug it in, turn it on and let it warm up for a half hour or so. Set your
PC to display in the resolution you use most often. First turn the
user brightness and contrast fully counterclockwise. Turn brightness up until
the raster lines in a totally black area appear, then back a hair until
they disappear. Then, turn the contrast control up until you get a fairly
bright picture. Fully clockwise is probably OK. Adjust FOCUS
for generally best focus. You will not be able to get it razor sharp
all over the screen - start at the center and then try to get the
edges and corners as good as you can without messing up the center too much.
Double-check that the focus is OK at your normal settings of brightness and
contrast.
The SCREEN control adjusts background brightness. If the two controls are
not marked, you will not do any damage by turning the wrong one - it will
be immediately obvious as the brightness will change rather than focus
and you can then return it to its original position (or refer to the section
on brightness adjustments to optimize its setting).
Interference from Electrical Wiring:
If the wiring of normal outlets is done correctly even without a safety
ground, the currents should be balanced and you will not experience a problem.
However, many circuits, particularly those involving setups like 3-way
switches or switched outlets and wiring in older buildings can have
unbalanced currents when active. If your monitors are close enough
to the wiring, there can be interference which will take the form of
a flickering or pulsating display.
Other than recommending moving the monitors, there is no easy solution.
They can be shielded with Mu Metal but that is expensive. Or you could
run all displays at a 60 Hz vertical rate (or 50 Hz depending on where
you live). However, this is inconvenient and will never be quite perfect.
Interference from other Equipment:
Any type of equipment which uses or generates strong magnetic fields can
interfere with a monitor. Other computer monitors or TVs, equipment with
power transformers, and electric motors will cause a pulsating or flickering
display. Loudspeakers or other equipment with static magnetic fields will
cause color purity and/or geometric distortion problems which degaussing
will not cure as long as the offending equipment remains nearby.
The easiest way to confirm that interference is your problem is to move
the monitor or suspect equipment to a different location. The only real
solution is to separate the monitor and interfering device.
Contour Lines on High Resolution Monitors:
These fall into the category of wavy lines, contour lines, or light and dark
bands even in areas of constant brightness. These may be almost as fine
as the dot pitch on the CRT or 1 or 2 cm or larger and changing across the
screen. If they are more or less fixed on the screen and stable, then
they are not likely to be outside interference. (However, if they are locked to
the image, then there could be a problem with the video board.)
One cause of these lines is Moire (interference patterns) between the
raster and the dot structure of the CRT. Ironically, the better the focus
on the tube, the worse this is likely to be. Trinitrons, which do not
have a vertical dot structure, should be immune to interference of this sort
from the raster lines (but not from the horizontal pixel structure).
You can test for Moire by slowly adjusting the vertical size. If it is Moire,
you should see the pattern change in location and spatial frequency as slight
changes are made to size. Changes to vertical position will move the patterns
without altering their structure - but they will not remain locked to
the moving image.
The patterns will remain essentially fixed in position on the face of the
CRT for horizontal size and position adjustments - the patterns will
remain fixed under the changing image.
How to eliminate it? If Moire is your problem, then there may be no easy
answer. For a given resolution and size, it will either be a problem or
not. You can try changing size and resolution - Moire is a function
of geometry. Ironically, I have a monitor which is nicer in this respect
at 1024x768 interlaced than at 800x600 non-interlaced.
Another cause of similar problems is bad video cable termination
creating reflections and ghosting which under certain conditions can be so
severe as to mimic Moire effects. This is unlikely to occur in all colors
with a VGA display since the termination is internal to the monitor.
Monitor Reliability with SVGA:
There are parts in the monitor which may get hotter with SVGA but if it is
designed for SVGA resolution, there should be no problem (assuming you are
not running in an excessively hot room or with the ventilation holes covered).
A good quality multisync monitor should not mind switching screen resolutions
frequently (though doing it every few seconds continuously may stretch this).
Newer multisync monitors should also be smart enough not to blow up if
you feed them a scan rate which exceeds their capabilities. However,
there are a lot of poorly designed monitors out there.
If it is supposed to run SVGA, use it at SVGA. If it blows up,
switch to a different brand. There are a lot of crappy monitors being
sold on their own and bundled with PCs.
CRT Replacement - Probably not worth it:
The sad fact is that even if you can obtain a new CRT, the cost to replace
it and make the needed color and geometry adjustments will likely be prohibitive.
As noted in the section on Monitor Life - the CRT is the heart of the
monitor; preserve it by turning the monitor off when not in use for an
extended period of time. Screen savers do not help.
Since components do change value when they warm up, some minor change in
position and size may be expected. How much drift occurs really
depends on many factors including the basic design, quality of components,
ventilation/cooling, etc. Of course, it is possible to have a monitor with
a component that is unusually sensitive to temperature. It could also
be related to line voltage, depending on the regulation of your monitor's
power supply.
In general, my feeling is that if it is not objectionable (a 1/2" shift
would be objectionable) AND its severity is not changing with time, you
can ignore it.
Many monitors do this. TVs do this but you are not aware of it since they
are already 5-10% overscanned for just this reason, as well as compensating
for component aging and line voltage fluctuations.
Q) Are there known conflicts with my video card?
[From: Michael Scott (firstname.lastname@example.org)]
Overclocking VLB to >40 MHz
If your motherboard operates at 50 MHz, it's quite possible that you
will have trouble with VLB video cards. The VESA specification
states that, at best, one card can operate at 40 MHz, or two can
operate at up to 33 MHz. Some manufacturers don't even guarantee
that their cards will run at 40 MHz, preferring to support bus
speeds of 33 MHz or less. I am unaware of _any_ vendor who will
guarantee that their VLB video card will work at >40 MHz. So, if
your VLB video card, running at >40 MHz, is causing problems, your
best bet is to step your bus speed down. As an alternative, try
another model or brand of card.
[From: Dylan Rhodes (Formerly of Hercules) ]
Version 2.0 of the VESA VL-Bus specification added support for a
50 MHz bus speed. However, VESA VL-Bus 2.0 is one of a few VESA specs
which went largely unimplemented by manufacturers. Just because the
VL-Bus 2.0 spec exists does not mean that all VL-Bus motherboards
manufactured since day one are now compatible with this new spec.
[From: Michael Scott (email@example.com) ]
IBM SLC Motherboards
Some VLB video cards will not operate properly in some 486slc
motherboards. Implementation of the 32 bit VLB with the 16 bit external
data path of this CPU was problematic on early incarnations. For the
most part, this was because of poor implementations of VLB on the
motherboards, not a video card problem. Later versions of these
motherboards overcame these problems, but if you have an older one you
may not be able to run some VESA video cards on it.
VLB and Memory Aperture
If you have a VLB system and your video card uses a memory aperture,
ensure that your system has adequate address space. Memory aperture
works by reserving linearly mapped address space, usually at high
addresses (120Meg+) which corresponds to the memory on the video card.
As a result, large linear memory transfers can be done without
resorting to regular VGA memory address segmentation. This means that
your system has to have more memory address space than physical memory,
or there will be conflicts between the memory aperture and physical RAM.
i.e. system RAM + video RAM <= maximum addressable RAM
For example, a system with 16 Meg of RAM that can address 128 Meg can
have a memory aperture at 120 Meg, for up to 8 contiguous megabytes.
However, if your system is a 486slc which has only 24 bit addressing,
it can only address 16 Meg of RAM. In this case, the memory aperture
must be located at <16 Meg (usually 12 Meg) so your total system RAM
can't exceed 12 Meg if you wish to take advantage of the speed increases
of using a memory aperture.
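The rule above can be checked with a few lines of arithmetic (a sketch;
the RAM sizes are just the examples from the text):

```python
def aperture_fits(system_ram_mb, video_ram_mb, address_bits):
    """Check the rule from the text:
    system RAM + video RAM <= maximum addressable RAM."""
    max_addressable_mb = (2 ** address_bits) // (1024 * 1024)
    return system_ram_mb + video_ram_mb <= max_addressable_mb

# 486DX-class VLB system with full 32-bit (4096 MB) addressing:
print(aperture_fits(16, 8, 32))   # True - aperture can sit up at 120 MB+

# 486slc with only 24-bit addressing (16 MB total address space):
print(aperture_fits(16, 4, 24))   # False - aperture collides with RAM
print(aperture_fits(12, 4, 24))   # True - 12 MB RAM + 4 MB aperture fit
```

The 486slc case shows why the aperture must drop below 16 MB (usually to
12 MB) and why system RAM is then capped at 12 MB.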
IBM's 8514/a and COM4
[ From: Michael Scott (firstname.lastname@example.org) and Dylan Rhodes
(Formerly of Hercules) and Jim at Hercules]
The 8514/a was designed to coexist with a VGA adapter, and for this
reason it uses a different range of addresses. Some of these are 16-bit
addresses which are located at h42E8, h82E8, h92E8, hA2E8 & hE2E8.
Unfortunately, many cheapo serial controllers only decode the first 12
bits of the I/O port addresses, and assume that calls to x2E8 (like all
of those listed above) are intended for the serial port rather than the
video card. This means that COM4 cannot be used on a machine with an
8514/a compatible video card _unless_ the address of COM4 can be changed
(usually via jumpers) on the serial card, or the serial controller
decodes all 16 bits of the I/O port addresses. There is no other way
to get COM4 and any 8514/a compatible display adapter to coexist.
Note that this is _not_ a shortcoming of 8514/a, but is rather a
limitation of most serial controllers.
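The aliasing is easy to demonstrate: mask the 8514/a register addresses
down to 12 bits and they all collapse onto COM4's base port. A quick
sketch of the decoding described above:

```python
# A cheap serial card that decodes only the low 12 bits of the I/O
# address sees every 8514/a register at xxx2E8 as COM4's base (0x2E8).

COM4_BASE = 0x2E8
ADDR_8514A = [0x42E8, 0x82E8, 0x92E8, 0xA2E8, 0xE2E8]

def decoded_port_12bit(addr):
    """What a 12-bit-decoding serial card thinks is being addressed."""
    return addr & 0xFFF

for a in ADDR_8514A:
    clash = decoded_port_12bit(a) == COM4_BASE
    print(f"{a:#06x} -> {decoded_port_12bit(a):#05x} clash={clash}")
# All five register addresses alias onto 0x2E8, so the serial card
# falsely responds; decoding all 16 bits keeps them distinct.
```

A card that decodes the full 16 bits compares against 0x02E8 exactly,
so 0x42E8 and friends no longer match.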
ATI Mach and S3 Vision/Trio cards and COM4
[ From: Michael Scott (email@example.com)]
ATI's Mach and S3's current chipsets were based on IBM's 8514/a standard
and have the same problems as the 8514/a. See 'IBM's 8514/a and COM4'
above, or consult http://www.hercules.com/knowbase/ : Do a search on COM4 and Terminator.
You are looking for item 575 in the knowledge base.
ATI Mach64 cards + Quicktime for Windows 3.1 = GPF
GPF (General Protection Faults) are all too common in Windows 3.1. In
this case, the fix is easy. Simply edit your system.ini file, and under
the [macx] heading, add the following line:
DeviceBitmaps=off
Games (like Myst) or other programs that use Quicktime for Windows 1.1
will require this fix.
If editing your system.ini file makes you nervous, try the following:
[Roger Squires (firstname.lastname@example.org)]
Go into the ATI FlexDesk, type " OPT"
(brings up hidden window) and uncheck DeviceBitmap
If you have any other tips or fixes for other boards or chipsets,
please submit them to Michael Scott (email@example.com).
Video Circuitry Integral with the Motherboard
[Michael Scott (firstname.lastname@example.org)]
If you're installing a new video card into an existing system that has
video circuitry integral with the motherboard, you will have to disable
the built-in video. Otherwise you will have conflicts between the
new video card and existing circuitry - they will both try to use the
same VGA address space.
If none of the above apply to you, then either talk to a professional,
or if you're a bit knowledgeable, you might try the following.
There are some general things to consider when you suspect that there
may be a hardware conflict between your video card and another part of
your system. The odds are that the conflict is due to either another
add-in card or a TSR (Terminate and Stay Resident) program. To be
able to determine this yourself, you have to know a little bit about
pc hardware and software configuration. In general, the following
procedure should help you to isolate the cause of your frustrations:
First, make sure it isn't a software conflict. This example is for
DOS users. Start by creating a boot floppy by getting to a command
prompt, putting a blank floppy into floppy drive A: and typing:
format /s a:
This will transfer the basic system files to the floppy. After this,
copy the absolute minimum TSR's onto the floppy, and put a bare-bones
config.sys and autoexec.bat on it. Take out sound card drivers,
cdrom drivers, RAM disks and anything else superfluous. Reboot the
computer with the floppy in, and see if the problem persists.
If not, incrementally add your TSR's back in until the problem
appears. At this point you know what is causing the conflict, and
can go about trying to get a new driver or configuring the existing one.
If the problem is still there, then the problem is in hardware.
The same basic approach works here. After your computer is shut
off, take the case off the back. You should ground yourself to
the computer's chassis (if metal) or power supply to avoid blasting
any of your add-in cards with static electricity. Remove all
but the most necessary cards - usually this means the video adapter
and I/O adapter are the only cards remaining. Reboot the system
with the minimal TSR's loaded and check for the problem. If it still
persists, and you have determined that a software conflict does _not_
exist, then your video card may be incompatible with your motherboard.
If the problem disappears, incrementally add your other cards back
into the machine until you find the offending card. Once you find
it, check the configuration of that card. Ensure that it isn't
using the same memory address space or interrupts as the video card.
Q) What are MDA, Hercules, CGA and EGA adapters?
Monochrome Display Adapter (MDA)
This was the first display adapter shipped with the IBM PC, and
was only capable of displaying text, at an effective pixel addressability
of 720x350. The MDA provides crisp monochrome text at a low vertical
refresh rate of 50 Hz, and an 18.43 kHz horizontal refresh rate.
Hercules Graphics Card
This adapter, introduced by Hercules Computer Technology, Inc.,
provided MDA compatibility and extensions for graphics at 720x348 pixel
addressability. Due to its popularity, several other vendors released cards
with Hercules compatibility modes, but unfortunately few are 100% compatible.
Color Graphics Adapter (CGA)
The CGA, released by IBM, supports 4 colours in graphics mode and
8 in text mode at a pixel addressability of 640x200. The CGA provides a
vertical refresh of 60 Hz with a horizontal refresh of 18.43 kHz. This
limited pixel addressability results in text which is considerably worse
than that provided by the MDA. An additional problem is that processor
access to the CGA interferes with screen refreshes, causing 'snow' on the
monitor. This results in an irritating flicker in some programs.
Enhanced Graphics Adapter (EGA)
The next offering by IBM has a pixel addressability of 640x350
and offers the display of 16 colours out of a palette of 64. It offers
backwards compatibility with the CGA. EGA displays (Enhanced Colour
Displays) have a 60 Hz vertical refresh rate and horizontal refresh rates
of 15.75 or 21.8 kHz.
For information on which displays are compatible with which adapters,
refer to "What monitors will work with my MDA/Hercules/CGA/EGA card?".
Q) What monitors will work with my MDA/Hercules/CGA/EGA card?
The wide variety of displays available makes a comprehensive list
unmanageable. However, a list of display types for PC compatible video
adapters is included below. Your best bet to determine compatibility
between your video card and a given display is to find out what the
equivalent IBM display is and refer to the chart below.
Display      Compatible  Colours  Text    Graphics    Scan
             Adapters             Res.    Resolution  Rates
------------------------------------------------------------------
Monochrome   MDA                          640x350     Vert-50 Hz
             Hercules    2        80x25   720x350     Hor-18.43 kHz
Color        CGA         16       40x25   320x200     V-60 Hz
             EGA                  80x25   640x200     H-18.43 kHz
Enhanced     CGA         16 of    40x25   320x200     V-60 Hz
Color        EGA         64       80x25   640x200     H-18.43 kHz
                                          640x350     or 21.8 kHz
Multisync    CGA         16 of    40x25   320x200     Variable
digital      EGA         64       80x25   640x200
Multisync    VGA         256 of   80x25   640x480     Variable
analog                   256k             800x600
VGA Color    VGA         256 of   40x25   320x400     V-70 Hz
display                  256k     80x25   640x400     H-31.5 kHz
VGA Mono                                  320x350
Q) What is VGA, and how does it work?
OK, the answer to this one could easily be a book (actually, see the
references because it _is_ a book or several). I'll give a very cursory
overview of what the VGA is capable of.
The Video Graphics Array is a standard established by IBM to provide
higher pixel addressability and more colours than are available with EGA.
In fact, VGA is a superset of EGA, incorporating all EGA modes.
The VGA consists of seven sub-systems, including: graphics
controller, display memory, serializer, attribute controller,
sequencer and CRT controller. Basically, the CPU performs most
of the work, feeding pixel and text information to the VGA.
Graphics Controller: Can perform logical functions on data being
written to display memory.
Display Memory: A bank of 256k DRAM divided into 4 64k colour planes.
It is used to store screen display data.
Serializer: Takes display data from the display memory and
converts it to a serial bitstream, which is fed to the
attribute controller.
Attribute Controller: Contains the colour LUT (Look Up Table) which
determines what colour will be displayed for a given pixel
value in display memory.
Sequencer: Controls timing of the board and enables/disables the
colour planes.
CRT Controller: Generates syncing and blanking signals to control
the monitor display.
It is beyond the scope of this FAQ to describe the functionality of
these components in detail, so for further reading consult Sutty &
Blair (see References).
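Still, the data path sketched above can be modelled in a few lines.
The following is a purely illustrative Python sketch (a real VGA does
this in hardware, and the function names here are invented, not VGA
register names): one bit of a pixel lives in each of the four colour
planes, the serializer assembles those bits into a 4-bit pixel value,
and the attribute controller's LUT maps that value to a colour.

```python
PLANES = 4

def serialize_pixel(planes, offset, bit):
    """Serializer (sketch): assemble a 4-bit pixel value from one bit
    of each of the four colour planes (bit 7 = leftmost pixel)."""
    value = 0
    for p in range(PLANES):
        value |= ((planes[p][offset] >> bit) & 1) << p
    return value

def lookup_colour(lut, pixel_value):
    """Attribute controller (sketch): translate the pixel value
    through the colour LUT."""
    return lut[pixel_value & 0x0F]

# Four 64k planes, as in the display memory description above.
planes = [bytearray(0x10000) for _ in range(PLANES)]
planes[0][0] = 0x80   # plane 0: leftmost pixel of byte 0 set
planes[2][0] = 0x80   # plane 2: leftmost pixel of byte 0 set
lut = list(range(16)) # identity palette, just for the sketch

value = serialize_pixel(planes, 0, 7)   # plane 0 + plane 2 -> 0b0101
print(value, lookup_colour(lut, value)) # 5 5
```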
VGA provides very low-level graphics commands. This, combined
with the fact that a VGA card has a frame buffer but no real
processing power, means that the PC's CPU has to do most of the graphics
number crunching. As a result, the VGA speed of a given computer is
highly dependent on its CPU speed, and the two cannot be decoupled.
Basically this renders VGA speed comparisons between video cards installed
in systems which use different processors meaningless. Also, the VGA
performance of a video card _can not_ be used to estimate how fast that
card will be in another video mode (i.e. SVGA, Windows 3.1, etc).
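To see why the CPU carries the load, consider what a dumb frame
buffer demands of it. The Python sketch below (illustrative only;
the real thing is CPU writes to segment A000h, not Python) mimics
mode 13h's linear 320x200x256 buffer: every pixel of even a small
solid fill is an address calculation plus a write done by the CPU.

```python
# Sketch of a dumb frame buffer like VGA mode 13h: 320x200 pixels,
# one byte per pixel, 64000 bytes. No acceleration means the CPU
# computes the address of, and writes, every single byte itself.
WIDTH, HEIGHT = 320, 200
framebuffer = bytearray(WIDTH * HEIGHT)  # stand-in for A000:0000

def put_pixel(x, y, colour):
    framebuffer[y * WIDTH + x] = colour  # offset = y*320 + x

def fill_rect(x0, y0, w, h, colour):
    # Even a small solid fill is w*h separate CPU writes.
    for y in range(y0, y0 + h):
        for x in range(x0, x0 + w):
            put_pixel(x, y, colour)

fill_rect(10, 10, 50, 20, 15)       # 1000 individual pixel writes
print(framebuffer[10 * WIDTH + 10]) # 15
```

An accelerator moves exactly this kind of loop onto the video card,
so the CPU issues one "fill rectangle" command instead of 1000 writes.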
VGA is really an outdated standard, but in fact, all PCs today boot in
VGA text mode 3 (see table below) and there is no indication that this
will change in the near future. Most DOS games still use it because of
its universality. While most GUI users think that 800x600 is a minimum
pixel addressability, most DOS games only use a 320x200 pixel mode. Now,
a number of SVGA games (640x480 with >16 colours or higher resolutions)
are being released. However, the larger number of pixels which are being
displayed require a faster processor and sometimes even a fast Pentium
can appear sluggish.
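The arithmetic behind that sluggishness is simple. A back-of-envelope
comparison (assuming 256 colours, i.e. one byte per pixel, in both
modes) of the data the CPU must push for one full frame:

```python
# Bytes per full-screen redraw at one byte per pixel (256 colours).
mode_320x200 = 320 * 200   # typical DOS game mode
svga_640x480 = 640 * 480   # entry-level SVGA game mode

print(mode_320x200)                 # 64000
print(svga_640x480)                 # 307200
print(svga_640x480 / mode_320x200)  # 4.8x the work per frame
```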
The VGA modes are:
Mode  Type  Resolution  Chars  Colours
0,1   text  360x400     40x25  16
2,3   text  720x400     80x25  16
4,5   gfx   320x200     40x25  4
6     gfx   640x200     80x25  2
7     text  720x400     80x25  mono
D     gfx   320x200     40x25  16
E     gfx   640x200     80x25  16
F     gfx   640x350     80x25  mono
10    gfx   640x350     80x25  16
11    gfx   640x480     80x30  2
12    gfx   640x480     80x30  16
13    gfx   320x200     40x25  256
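The display memory each graphics mode needs follows directly from the
table. As a rough sketch (the helper name is invented; the 16-colour
modes actually store their 4 bits per pixel as 1 bit in each of the
four colour planes):

```python
def mode_bytes(width, height, colours):
    """Bytes of display memory for a packed width x height mode."""
    bits_per_pixel = max(1, (colours - 1).bit_length())
    return width * height * bits_per_pixel // 8

print(mode_bytes(320, 200, 256))  # mode 13: 64000 bytes, fits in 64k
print(mode_bytes(640, 480, 16))   # mode 12: 153600 bytes (4 planes of 38400)
print(mode_bytes(640, 480, 2))    # mode 11: 38400 bytes
```

All of these fit within the VGA's 256k of display memory, which is
why the VGA tops out at 640x480 in 16 colours.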
The next 'standard' (and hopefully it will be widely adopted) is
VESA SVGA, which provides standard SVGA modes (pixel addressabilities &
colour depths), registers and refresh rates.
END of comp.sys.ibm.pc.hardware.video FAQ - Part 3/4
Michael J. Scott R.R.I., U of Western Ontario
email@example.com 'Need a good valve job?'
PC Video Hardware FAQ: http://www.heartlab.rri.uwo.ca/videofaq.html
############### Illegitimus non tatum carborundum. ##############