Saturday, October 12, 2013

320x200 : The Resolution of Choice for the IBM PC

IBM decided that its Color/Graphics Adapter would support a 320x200 pixel resolution in its "medium" resolution graphics modes.  In its 40-column text mode, the 8x8 character cell gives an equivalent resolution with 25 rows on the screen.  The 16KB CGA card could only display 4 colors on the screen at one time from a 16-color palette, barring graphics tricks.  It also supported a high-resolution graphics mode at 640x200 pixels with one freely selectable color, but this was comparatively seldom used except when games turned it into a 160x200 pixel mode using composite color graphics.  Its 80-column text mode, with 25 rows and an 8x8 character box, likewise corresponds to a 640x200 pixel resolution.
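
As a back-of-the-envelope check (my own arithmetic, not anything from IBM's documentation), both CGA graphics modes need the same 16,000 bytes of video memory, which is why they just fit the 16KB card:

    /* Framebuffer sizes for the CGA graphics modes; both come to 16,000
       bytes, just under the 16,384 bytes on the card. */
    #include <stdio.h>

    int main(void)
    {
        printf("320x200 x 2 bpp = %d bytes\n", 320 * 200 * 2 / 8);  /* 16,000 */
        printf("640x200 x 1 bpp = %d bytes\n", 640 * 200 * 1 / 8);  /* 16,000 */
        printf("card RAM        = %d bytes\n", 16 * 1024);          /* 16,384 */
        return 0;
    }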

IBM did not offer a color display at the launch of the PC.  It was assumed that most users would connect the CGA card directly to a color composite monitor or to a TV via an RF modulator.  While NTSC-standard color displays could support up to 240 visible lines without interlacing, a large portion of the visible area on these devices could be obscured by the bezel surrounding the picture tube.  Earlier home computers from Apple and Atari only displayed 192 lines as a result.  IBM's 200 lines were hardly likely to tax the newer displays of the 1980s, which showed a more rectangular viewing area than the more rounded TV screens of earlier decades.  In 1983 IBM released its 5153 PC Color Display, which provided official RGBI support from the Corporation.  This monitor had a vertical size control, which could accommodate 240 lines quite easily.  The CGA card's hardware and 16KB of RAM could not: 240 lines at 4 colors would need 19,200 bytes, more than the 16,384 available.

Jill of the Jungle CGA 320x200x4 Mode
The next widely used graphics advance was the PCjr.'s enhanced CGA graphics, later cloned by Tandy and known thereafter as Tandy graphics.  This also primarily used a 320x200 pixel graphics mode with 16 colors available, but it also supported a true but seldom-used 160x200x16 low-resolution graphics mode and a very rarely used 640x200x4 graphics mode.  IBM's PCjr. was only CGA compatible at the BIOS level, but Tandy's graphics adapter was CGA compatible at the register level.  The PCjr. and Tandy graphics adapters took 16-32KB of system RAM for their video memory, depending on the mode.
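
The 16-32KB figure falls straight out of the mode geometry; here is a quick sketch of the arithmetic (mine, for illustration):

    /* PCjr/Tandy video memory is carved out of system RAM, so the
       per-mode footprint mattered. */
    #include <stdio.h>

    int main(void)
    {
        printf("160x200 x 4 bpp = %d bytes (16KB mode)\n", 160 * 200 * 4 / 8);  /* 16,000 */
        printf("320x200 x 4 bpp = %d bytes (32KB mode)\n", 320 * 200 * 4 / 8);  /* 32,000 */
        printf("640x200 x 2 bpp = %d bytes (32KB mode)\n", 640 * 200 * 2 / 8);  /* 32,000 */
        return 0;
    }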

IBM's EGA card also supported a 320x200x16 graphics mode, and this was by far the most frequently used graphics mode in EGA-supporting games.  The EGA could also support a 640x200x16 graphics mode and was backwards compatible with CGA at the BIOS level.  Tandy graphics and EGA graphics would almost invariably look the same, but the hardware was very different.  Tandy's Enhanced Graphics Adapter, introduced with the Tandy 1000TL and 1000SL, also supported a 640x200x16 mode, but few programs used it because it was not EGA compatible.  Amstrad's CGA-compatible adapter likewise offered a unique 640x200x16 mode that saw little use.

Jill of the Jungle EGA 320x200x16 Mode
All the above graphics modes worked on the same type of monitor, a digital TTL RGB monitor capable of selecting only sixteen colors.  This monitor supported the same NTSC horizontal (15.75KHz) and vertical (60Hz) scan rates as a television set.  For the EGA, IBM also included support for a 640x350 mode with 16 colors selectable out of a 64-color palette.  This mode only worked on a special color monitor, the 5154 PC Enhanced Color Display, which supported a higher horizontal scan rate (21.8KHz) and the ability to select 64 colors through digital TTL RGB inputs.  The standard IBM EGA card only came with 64KB of video RAM, but the 640x350x16 mode required 128, 192 or 256KB on the card.  Most clone cards came with 256KB standard.
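
A quick calculation (again my own, not IBM's) shows why 64KB is not enough for the 350-line mode: EGA stores its 16-color modes as four bitplanes, and a 64KB card can give each plane only 16KB.

    /* Per-plane and total memory for the two main EGA 16-color modes. */
    #include <stdio.h>

    int main(void)
    {
        int plane_320x200 = 320 / 8 * 200;   /*  8,000 bytes per plane */
        int plane_640x350 = 640 / 8 * 350;   /* 28,000 bytes per plane */

        printf("320x200x16: 4 planes x %d = %d bytes\n", plane_320x200, 4 * plane_320x200);
        printf("640x350x16: 4 planes x %d = %d bytes\n", plane_640x350, 4 * plane_640x350);
        printf("64KB card leaves each plane %d bytes\n", 64 * 1024 / 4);
        return 0;
    }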

In 1987, IBM introduced the VGA and new corresponding monitors.  VGA supported a 320x200x256 graphics mode with a palette of 262,144 colors available.  This was the mode most frequently used by games.  VGA was backwards compatible with EGA at the register level and with CGA at the BIOS level.  It also supported a 640x480x16 mode, but far fewer DOS games used it; Windows 3.0 and above would use it for its default graphics display.  A new monitor was required to display the much larger color palette of the VGA compared to the CGA and EGA, so analog color outputs were used.  The display supported a 31.5KHz horizontal scan rate and a 70Hz vertical refresh rate for all VGA modes, including emulated modes, except for the 640x480 mode, which used 60Hz.  200-line modes would be double scanned, with each scan line displayed twice (and, in 320-pixel modes, each pixel double-clocked) so that the higher-resolution raster is filled.  This gives a different kind of scan-line structure compared with earlier monitors.
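
Part of the appeal of the 320x200x256 mode was how little code it took: the whole 64,000-byte screen sits in one 64KB segment at A000h, one byte per pixel.  A minimal real-mode sketch, assuming a Borland-style 16-bit DOS compiler (the BIOS call and segment address are standard; the helper names are mine):

    #include <dos.h>

    void set_mode_13h(void)
    {
        union REGS r;
        r.x.ax = 0x0013;                 /* INT 10h, AH=00h: set video mode 13h */
        int86(0x10, &r, &r);
    }

    void put_pixel(int x, int y, unsigned char color)
    {
        /* Mode 13h: flat 320x200 buffer at segment A000h, one byte per pixel. */
        unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, 0);
        vram[y * 320 + x] = color;
    }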

Jill of the Jungle VGA 320x200x256 Mode
By the time of VGA, the 320x200 resolution had found support in many non-IBM PC compatible home computers.  The Commodore 64 used a 320x200 resolution and a derived 160x200 resolution.  The Atari ST, Commodore Amiga and Apple IIgs all used a 320x200 (and to a far lesser extent 640x200) resolution with varying degrees of color and palette support.  While the PAL Amiga supported 256 lines by default, most games used 200 lines for extra speed.  That is why screenshots of some Amiga games look vertically squashed compared to other systems: they show 200-line screens as displayed on PAL machines.

Most VGA games only supported the 320x200x256 graphics mode.  The BIOS mode, Mode 13h, was easy to program for but somewhat limited.  Eventually programmers found out how to create custom resolutions by programming the VGA hardware registers directly, the so-called Mode X.  Mode X typically consisted of 320x240 pixels, which gives square pixels.  Epic Pinball and The Lost Vikings used this mode.  Some games used a 320x400 graphics mode, which was easy to obtain on VGA hardware.  Programmers had to be careful to ensure that their custom mode would be compatible with the wide variety of VGA adapters in the marketplace.  A standard 256KB VGA can support any combination of 320 or 360 horizontal pixels by 200, 240 or 350 vertical pixels with 256 colors.
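
The core of Mode X is "unchaining" the VGA so its 256KB becomes addressable as four planes.  The sketch below (assuming a Borland-style 16-bit DOS compiler; the helper names are mine) shows the well-known Sequencer and CRTC writes that unchain the BIOS 320x200 mode; a true 320x240 Mode X additionally reprograms the CRTC vertical timing registers, which is omitted here.

    #include <dos.h>

    void set_unchained_320x200(void)
    {
        union REGS r;
        r.x.ax = 0x0013;              /* start from BIOS Mode 13h */
        int86(0x10, &r, &r);

        outportb(0x3C4, 0x04);        /* Sequencer: Memory Mode register   */
        outportb(0x3C5, 0x06);        /*   turn Chain-4 off                */
        outportb(0x3D4, 0x14);        /* CRTC: Underline Location register */
        outportb(0x3D5, 0x00);        /*   turn doubleword addressing off  */
        outportb(0x3D4, 0x17);        /* CRTC: Mode Control register       */
        outportb(0x3D5, 0xE3);        /*   use byte addressing             */
    }

    void put_pixel_unchained(int x, int y, unsigned char color)
    {
        unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, 0);
        outportb(0x3C4, 0x02);            /* Sequencer: Map Mask register     */
        outportb(0x3C5, 1 << (x & 3));    /* select the plane holding pixel x */
        vram[y * 80 + (x >> 2)] = color;  /* 80 bytes per scan line per plane */
    }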

I have included screenshots of Jill of the Jungle above.  Jill supports 320x200 in all three color modes.  The game does not support 320x200x16 graphics on an IBM PCjr. or Tandy 1000 graphics adapter (few if any shareware games supported the unique graphics modes of these adapters); on those machines it will use the 320x200x4 mode instead.  Except for Jill's face, the graphics are virtually identical, pixel for pixel, across the three modes.  Many games down-convert the graphics with an algorithm to eliminate the need to store two extra sets of graphics images or tiles on the disk.
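
I do not know exactly how Jill performs its down-conversion, but a common approach is simply to map each of the 16 colors to the nearest of the 4 CGA colors once, then run every tile through that table.  A purely illustrative sketch (the palette choice and all names are my own):

    #include <stdio.h>

    struct rgb { int r, g, b; };

    /* CGA palette 1, high intensity: black, cyan, magenta, white. */
    static const struct rgb cga[4] = {
        {0x00,0x00,0x00}, {0x55,0xFF,0xFF}, {0xFF,0x55,0xFF}, {0xFF,0xFF,0xFF}
    };

    /* The standard 16-color RGBI palette. */
    static const struct rgb rgbi[16] = {
        {0x00,0x00,0x00},{0x00,0x00,0xAA},{0x00,0xAA,0x00},{0x00,0xAA,0xAA},
        {0xAA,0x00,0x00},{0xAA,0x00,0xAA},{0xAA,0x55,0x00},{0xAA,0xAA,0xAA},
        {0x55,0x55,0x55},{0x55,0x55,0xFF},{0x55,0xFF,0x55},{0x55,0xFF,0xFF},
        {0xFF,0x55,0x55},{0xFF,0x55,0xFF},{0xFF,0xFF,0x55},{0xFF,0xFF,0xFF}
    };

    /* Pick the CGA color with the smallest squared RGB distance. */
    static int nearest_cga(struct rgb c)
    {
        int best = 0, best_d = 1 << 30, i;
        for (i = 0; i < 4; i++) {
            int dr = c.r - cga[i].r, dg = c.g - cga[i].g, db = c.b - cga[i].b;
            int d = dr * dr + dg * dg + db * db;
            if (d < best_d) { best_d = d; best = i; }
        }
        return best;
    }

    int main(void)
    {
        int i;
        for (i = 0; i < 16; i++)
            printf("16-color index %2d -> CGA color %d\n", i, nearest_cga(rgbi[i]));
        return 0;
    }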

320x200 has a 1.6:1 (8:5) image aspect ratio if the pixels are square.  To show truly square pixels, a 4:3 display must letterbox the image.  A 16:10 widescreen monitor at 1280x800, 1920x1200 or 2560x1600 can display the resolution perfectly using nearest-neighbor integer scaling.  However, when the resolution was in common use, all displays were 4:3, and most users would stretch the 200 vertical lines to fill the screen.  Instead of perfectly square pixels, you would get pixels 1.2 times as tall as they are wide once the image has been stretched to the edges of the monitor.  Most graphic artists assumed this and adjusted their graphics accordingly, but not all did.
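
The numbers behind that paragraph, worked out explicitly (my own arithmetic):

    #include <stdio.h>

    int main(void)
    {
        /* Vertical stretch needed to turn an 8:5 image into a 4:3 picture. */
        double stretch = (320.0 / 200.0) / (4.0 / 3.0);                     /* = 1.2 */

        printf("vertical stretch factor : %.2f\n", stretch);
        printf("square-pixel 4:3 target : 320x%d (e.g. 1600x1200)\n",
               (int)(200 * stretch + 0.5));
        printf("integer scales on 16:10 : 1280x800 = 4x, 1920x1200 = 6x, 2560x1600 = 8x\n");
        return 0;
    }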

An illustrative example: look at this screenshot of Elite Plus, using the VGA 320x200x256 graphics mode.


You can see that the circle is a circle in the 1.6:1 aspect ratio.  But when converted to a 4:3 aspect ratio:



The circle has become an oval.  Thus it would seem that the 1.6:1 aspect ratio is correct for this game.

Let's look at another game, LOOM.  Here is a screenshot with a clearly spherical object in it:


Looks a bit squat in the 1.6:1 aspect ratio.  If we stretch the aspect ratio:


Now the crystal ball looks like a sphere in a 4:3 aspect ratio.  Click the 4:3 images for an undistorted, pixel-perfect but huge (1600x1200) resized version of the screenshot.

Even when Windows 95 was released, most graphically intensive games for the PC were still being released for DOS.  Only in 1997, with the acceptance of 3D accelerators, DirectX and the undeniable dominance of the Windows platform, did high-performance games finally require Windows.  Most games up to this point either supported only 320x200 (DOOM, Daggerfall) or offered it as the default resolution (Duke Nukem 3D, Quake).  SVGA was not well supported because each chipset had its own way of offering higher-resolution modes, and by the time VESA modes were widely supported, Windows 95 was the gaming OS of choice.  At that point, 640x400x256 and 640x480x256 graphics were the norm.
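
For what it's worth, the eventual VESA VBE interface reduced "SVGA" to a single BIOS call with standardized mode numbers, which is why late DOS titles could finally offer higher resolutions portably.  A hedged sketch, assuming a Borland-style 16-bit DOS compiler (the function and mode numbers are the standard VBE ones; the helper name is mine):

    #include <dos.h>

    int set_vesa_640x480x256(void)
    {
        union REGS r;
        r.x.ax = 0x4F02;     /* VBE function 02h: set SuperVGA video mode */
        r.x.bx = 0x0101;     /* standard VBE mode 101h = 640x480, 256 colors */
        int86(0x10, &r, &r);
        return r.h.al == 0x4F && r.h.ah == 0x00;  /* AL=4Fh: supported, AH=00h: success */
    }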

8 comments:

  1. I always look forward to your posts for the breadth of information you provide.

    ReplyDelete
  2. "must have letterboxing" -- I think you meant pillarboxing.

    ReplyDelete
    Replies
    1. No, I meant letterboxing. Letterboxing is the standard aspect-ratio correct method to show widescreen material on a full screen display. 320x200 with square pixels is a widescreen mode. Pillarboxing is the standard aspect-ratio correct method to show full screen material on a widescreen display.

      Delete
  3. FWIW, Amiga games using 320x200 was mainly to cater to NTSC Amigas, which used 320x200 with a 4:3 screen aspect ratio but an 8:9 pixel aspect ratio, vs. PAL Amigas which used 320x256 displays with a 4:3 screen aspect ratio but a 16:15 pixel aspect ratio. See e.g. this reference.

    So it wasn't actually primarily some sort of speed hack, though it technically did have minor advantages there, as bitplane DMA could end earlier, freeing up more chip memory bandwidth for other things. (The Amiga had a partially-UMA system with the CPU and several gfx/sfx/io coprocessors sharing a memory bank. Memory used by all processors was called "chip memory" (used by the custom coprocessor chipset as well as the CPU), memory accessible to the CPU only was called "fast memory" (as it was faster from the CPU's perspective) - a lot of Amigas only had chip memory out of the box. Different generations of Amiga have different maximum chip memory sizes, from 512kiB to 2MiB, with nearly-but-not-quite-released Amiga designs going to 8MiB, so some emulators support that.)

    PAL Amiga users generally kind of hated the way some Amiga games would just use 320x200-only, as it would be letterboxed and slightly distorted-looking on their displays. Good game developers sometimes allowed proper switching. OTOH, European Amiga games might be 320x256-only and the NTSC guys would get a cut-off display (but the Amiga was far, far more popular in Europe than America, of course...)

    The Amiga had crazy powerful (for the era) programmable gfx hardware, including quite a bit of ability to adjust actual pixel/scanline timings, and even sync to an external clock signal. This made it very popular in the video processing world of the 1980s and 1990s TV/movie industry, as it could genlock and chroma-key with (relatively) inexpensive hardware add-ons (like the famous "Video Toaster").

    A lot of amiga modes used very-non-square (roughly 2x1) pixel aspect ratios to have more horizontal resolution. And of course the cursed flickery interlacing (showing alternate scanlines every frame) for simulated double vertical resolution. Ultimately most modes ended up with an actual screen aspect ratio of 4:3 even if they were something deeply weird like 1280x256.

    The Amiga could also be coaxed into "overscanning" so that it filled the display right to the edges of the PAL or NTSC video signals instead of being a rectangle with borders. It did this by adding more pixels (e.g. 704x484 NTSC / 704x576 PAL), not making the pixels bigger, so detail wasn't lost.

    I think it might have also been able to output analog-era-widescreen-tv-suitable signal, but hazy on that.

    ReplyDelete
  4. I miss my Amiga 1000 and Guru Meditation!

    ReplyDelete
  5. Nice! Now I understand why I can see a perfect screen ratio playing DOSBOX MK II on my LG 16:10 monitor in 1440x900 res.

    ReplyDelete
  6. I think I've found a workable "perfect" resolution based on this post that should work for .... 16K resolution monitors.

    9600x7200 manages to fill the vertical space of a 16K 16:9 monitor by 83%. This is close enough to be playable without too much issue. At every lower resolution, the nearest "perfect" 5:6 pixel height resolution takes up much closer to 50% of the vertical height. Since 16:9 appears to be the aspect ratio of choice for the foreseeable future, and 16K is pretty much at the upper limit of what our eyes can discern even at close range, this looks to be the best compromise we're going to get.

    ReplyDelete
  7. The main reason why 320x200 pixels at 8-bit color depth was used was the addressable RAM window of 128 KiB between the 640 KiB and 768 KiB memory addresses of the 8086 CPU.
    Because the 16-bit-wide address register limited a memory segment to 2^16 = 64 KiB, and because double buffering allowed drawing pixels in the background while the data of the other segment was displayed, you have to divide these 128 KiB by two; thus only a 64 KiB window was available for each buffer.
    And 64 KiB allows you a vertical resolution of (64 * 1024) / 320 = 204 pixels, which was simplified to 200 pixels. The 320*4 pixels that weren't used amounted to 1,280 bytes of extra RAM that could be used for sprite data.

    ReplyDelete