The Average Website Is Now As Big As Doom

Megalith

“Doom is a multi-level first person shooter that ships with an advanced 3D rendering engine and multiple levels, each comprised of maps, sprites and sound effects,” but “2016's web struggles to deliver a page of web content in the same size. If that doesn't give you pause, you're missing something.”

The average web page is now roughly the same size as the full install image for the classic DOS game Doom. This is according to Ronan Cremin, a lead engineer with Afilias Technologies and dotMobi's representative to the W3C (World Wide Web Consortium). Cremin points to data from the HTTP Archive showing that, at roughly 2.3MB, the average page now rivals the original DOS install of the id Software mega-hit. The HTTP Archive report places the average web page at around 2,301KB, just shy of Doom's 2,393KB footprint.
 
“Doom is a multi-level first person shooter that ships with an advanced 3D rendering engine and multiple levels, each comprised of maps, sprites and sound effects”

I hope you are talking about the current Doom, because DOS Doom absolutely was not 3D, and that's actually one of the most badass testaments to its cleverness.
 
I think it's a false comparison: they should be comparing Doom's source code to the website code.

Doom was also written in an era when every byte counted and graphics were 256-color. Nowadays people think nothing of splurging on big graphics images, and nobody cares about code size.
 
I think it's a false comparison: they should be comparing Doom's source code to the website code.

Doom was also written in an era when every byte counted and graphics were 256-color. Nowadays people think nothing of splurging on big graphics images, and nobody cares about code size.

The real irony is that people get sloppy when there are no restrictions in place. It's only through restraint and limitations that we truly excel.
 
Actually I am laughing at a really big issue that got ignored: Doom ran on CGA/VGA-class displays in a 320×200 pixel mode with four or sixteen colors, which is 64,000 pixels in total. "8-bit" gaming means 256 colors, one byte per pixel (a byte is eight bits, which is where the term comes from), so even at one byte per pixel the whole frame buffer is only about 64KB, and at four or sixteen colors it is a fraction of that. A PNG of that same 320×200 image, even Lanczos-resampled, runs maybe 10K, and on today's monitors that is a site avatar, not the graphic you are actually looking at. Most website images are something like 300 by 800, with smaller graphics around 300 by 100; anything smaller on a 1920×1200 screen or a phone is not clear enough to see, but every one of those takes up more space than the whole 320×200 screen did. I remember the screens back then; you did not think you needed more. I remember having issues drafting some of the blueprints early on, but not because of the resolution. It was more wrapping my head around the concept of drawing only certain things on certain layers, then looking at the top-down view and trying to make heads or tails of the layers together.

I looked at my pages and most are still in the kilobytes, even with images on them. But if websites are worried about the images not loading right away, they may simply be loading the images with an onload script command, which means that even if you don't see the image, your browser loads it into memory the instant the page loads, or tries to, downloading a temporary copy of every image on the web page or site depending on how the code is written. Something to think about: what are the ten most visited sites?

1. Google
2. Facebook
3. YouTube
4. Yahoo
5. Baidu
6. Amazon
7. Wikipedia
8. Taobao
9. Twitter
10. QQ

That says something. A couple of those are search engines, several are social media, one is a bookstore plus whatever else they can sell, and I'm not actually sure what some of them are; I'm not sure I want to look in case I see something like a rickroll. I know I use Google and one other site as what my browser windows open to by default, and so does everyone who uses the default Mozilla and Chrome browsers. But the sites I do recognize on there are all image-heavy, because that is what people are looking for on those sites (well, the ones I know). Adobe refuses to use Lanczos, and considering most people use that or GIMP, and Adobe products are easier to use, that likely accounts for a lot of bloated images. Every time people ask them about correctly adding the PNG code to their software, they say it is good enough and if you don't like it, don't use it. So we all just have to keep using broadband. Most artists will work in Photoshop and use another program to compress the images, but things like Dreamweaver and so forth are all built to connect through Adobe Bridge, so you can either build the site and then re-batch all the images afterwards, or code more of it by hand.

Though Doom came out when we had DOS, and people were already making jokes about the 640K limit DOS had carried since the 1980s. The hard drive on the Compaq from 1983, ten years before Doom shipped, held 10 megabytes, or roughly four times the size of the one game. It makes you laugh when people complain about how much space games take up on our hard drives these days; it is likely a perspective thing. Kind of like the fact that in that same old newspaper you find Apple winning one of the first software copyright cases, against a company named Franklin Computer, likely an Apple clone the way the PC clones were.
My first computer, which I got for school because I needed to work in AutoCAD for homework, had maybe a 500 megabyte drive right around this same time frame. I remember playing over DWANGO on the weekends because I was only supposed to use it for class work during the week, not that anyone ever checked.

Something to think about is that the current client (it is in testing; I can't remember if we are allowed to mention it) is more about helping you visualize what is being built, along the same lines. Back then you had blueprints, and you had to scan in images and look at them in another window or on a second monitor like I did, which was a pain since most computers only had one VGA port unless you added a graphics accelerator. The funny thing is most people these days assume that everything was 8-bit or higher. I remember playing on monochrome green glowing letters and ASCII art, then CGA and later color; Doom ran on CGA in a 320×200 pixel mode with four colors, if I remember correctly. Today I'm working on a monitor with a 12-bit LUT, and there are 14-bit LUT monitors out there.
 
The funny thing is most people these days assume that everything was 8-bit or higher. I remember playing on monochrome green glowing letters and ASCII art, then CGA and later color; Doom ran on CGA in a 320×200 pixel mode with four colors, if I remember correctly. Today I'm working on a monitor with a 12-bit LUT, and there are 14-bit LUT monitors out there.

Doom ran on MCGA -- which was one of the new graphics modes that IBM introduced along with VGA on the PS/2 (and was one of the few extended modes that were actually present on the PS/2 Model 25 and 30).

320x200x256 color using an 18 bit color lookup table (clut) -- 6 bits per color, 262144 colors possible.

And your monitors aren't 12 bit lut based -- they are 12 bit per color, though most RGB work is still 8:8:8, though you can get potentially more colors and more accurate reproductions if you shift over to a YUV color space.
 
1. Google
2. Facebook
3. YouTube
4. Yahoo
5. Baidu

How did you get on Baidu? Are you talking about the file sharing site? You have to have a Chinese phone number; is there a way around that? I have been wanting to get on that site for about a year, but haven't seen a way to do it.
 
Grin. Where in the world did you find an MCGA card? An old Model 25 or Model 30? I'm using a workstation monitor, not a two-dollar TV or gaming monitor, so yes, I have an actual LUT built into the monitor, like many people who need to know what the colors are supposed to be even if the display cannot show them. Nice try, though. CGA is a 16-color mode, EGA is 256 colors, and VGA is 16 million colors. "8-bit gaming" is called that because of the old web comic that came along right around 2002 or so, nearly a decade after most of us had gotten rid of the old consoles and computers.

Back then we had 16 colors, 256 colors, 16 million colors, 24 million colors, and 24 million colors plus an 8-bit alpha channel, or "32 million colors," listed on the displays. I'm pretty sure some of that stuff is on display in the Smithsonian. I'm sorry you missed the old gaming days the first time they were here, but most games really do look better these days.

Snicker. Here is what you are thinking about: Doom came out in 1993, but back then most people did not have two grand to drop on a kid's computer, so adults made do with Commodore 64s and VIC-20s. I had a VGA-capable system for AutoCAD, and that system cost over two thousand dollars; some of the nice ones were twelve thousand dollars. My first car, which I bought used for two grand, was a 1981 320i. Video games were targeted at the systems people had, not the ones they would have liked to have. Well, that and we were playing over dial-up connections that cost per minute. I remember getting grounded for a month for making what I thought was a long distance phone call that ended up costing five hundred dollars...

Multicolor Graphics Adapter (MCGA)


MCGA shipped first with the IBM PS/2 Model 25 in 1987. MCGA graphics were built into the motherboard of the computer. As a sort of step between EGA and VGA, MCGA had a short life and was shipped with only two IBM models, the PS/2 Model 25 and PS/2 Model 30 and fully discontinued by 1992. The MCGA capabilities were incorporated into VGA. Note: Some say that the 256-color mode of VGA is MCGA but, to be accurate, no MCGA cards were ever made; only the two IBM PS/2 models indicated had true MCGA chips. The 256-color mode of VGA, while similar, stands alone as part of the VGA specification.


The specific MCGA display modes included:


  • All CGA modes (except the text mode that allowed connection of the MDA (model 5151) monitor).
  • 640×480 monochrome at a 60 Hz refresh rate.
  • 320×200 256-color at a 70 Hz refresh rate. The 256 colors were chosen from a palette of 262,144 colors.

Like the other IBM standards, clone makers quickly cloned VGA. Indeed, while IBM produced later graphics specifications as we’ll see below, the VGA specification was the last IBM standard that other manufacturers followed closely. Over time, as extensions to VGA appeared, they were loosely grouped under the name Super VGA.


 
How did you get on Baidu? Are you talking about the file sharing site? You have to have a Chinese phone number; is there a way around that? I have been wanting to get on that site for about a year, but haven't seen a way to do it.

Actually I just used the same site they did to pull the top ten sites that Alexa gets data on: http://httparchive.org/urls.php. Sorry for the link, but it is easier than getting people to follow the link to the site and then to the site it refers to. If you want it edited out, a mod can swap the site link for the HardOCP one.
 
Grin. Where in the world did you find an MCGA card? An old Model 25 or Model 30? I'm using a workstation monitor, not a two-dollar TV or gaming monitor, so yes, I have an actual LUT built into the monitor, like many people who need to know what the colors are supposed to be even if the display cannot show them. Nice try, though. CGA is a 16-color mode, EGA is 256 colors, and VGA is 16 million colors. "8-bit gaming" is called that because of the old web comic that came along right around 2002 or so, nearly a decade after most of us had gotten rid of the old consoles and computers.


"MCGA mode" (mode 13h) -- 320x200x256 is what Doom ran in and was a subset of the standard modes in all of the VGA capable PS/2 line, it's just that the Model 25/30 ONLY had the MCGA adapter with that mode and an additional pseudo-EGA mode. On this, I am WAY too familiar, as I was an in-store IBM Retail Sales Rep in 1988/1989 and had to try to sell the bloody things to people, as opposed to the Packard Bell units sitting one aisle over that had real VGA cards in them, 286 processors, and sold for half the price (a job I quit when they cut our pay to minimum wage and put us on commission, "to encourage us to sell more of the IBM's," which we could essentially only do by lying to the customers).

I've had PC compatibles since I built my first NEC V20 powered Turbo-XT circa 1986 with a Hercules graphics adapter (before that, I had already run the gamut of Apple ][, //e, //c, and //gs).

I also spent a bit of time around then playing with the original 8514A on, if memory serves, a PS/2 Model 60, and later on a TMS340 series TI based 8514 adapter.

It was also around this time that you started to see the first truly decent clone VGA chips start coming out -- specifically the Tseng 4000 and Trident 8800/8900 series. This was the point at which the ability to display 800x600 or 1024x768 became relatively common, and most of these "Super VGA" modes maintained the 8-bit per pixel (256 CLUT) scheme for color definition (i.e. 256 colors maximum); the 16 bit 5:5:5/5:6:5 [32K/64K "Hi-Color"] formats came later and were in turn quickly superseded by the 8:8:8 "True Color" modes. This generation of cards, especially the ET4000, I know intimately (and still have the full manufacturer data sheets and programming specifications sitting on my shelf -- which were insanely difficult to get at that time, but I somehow managed to talk one of Tseng Lab's engineers at Comdex-Atlanta into giving me a copy), as I wrote and submitted the code and timings that XFree86 used for many of the "tweaked/enhanced" timing modes for these cards (832x624, etc.), as well as some other custom timing modes designed to work with high resolution monitors that originally came with Sun/360 workstations. I fondly remember the days when I got in my first HiColor DACs for whatever video card I had at that time (Trident 8800, I think), and I also remember playing around with the old, weird Edsun DAC that did internal color interpolation.

Also, the LUT you are talking about on your current monitor is the LUT that is used to map the incoming video signal into index values for display on the LCD panel itself -- which is used for color correction, gamma correction, etc., to gain better color reproduction. And you are correct in that on higher end monitors you may actually have 14 or 16 bit LUTs, which allows the incoming signal from the computer (which is still typically 8:8:8 most of the time, though over HDMI and/or DisplayPort most newer cards can support 10:10:10 or 12:12:12) to be corrected to a level approximately 4x or 8x as accurate as what is coming in from the source (i.e. so you can color match red to an actual panel adjustment level of say 12.25 vs the 12 that was coming in from the computer). The problem is that trying to get anything higher than 8:8:8 out of Windows (except for video playback through the scaler) can often be a royal pain in the arse -- though with Quadros/FirePros, the right drivers, and software that supports it, it's doable.
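
As a rough illustration of the idea only: a monitor-style 1D correction LUT maps each incoming 8-bit channel value to a finer panel value. This is just a sketch in C; a real monitor's LUT is loaded from factory/user calibration data rather than computed from a gamma curve, and the 14-bit width is simply the example width mentioned above.

    /* sketch: map an 8-bit input channel through a 14-bit correction LUT */
    #include <math.h>

    #define LUT_BITS 14
    #define LUT_MAX  ((1 << LUT_BITS) - 1)

    static unsigned short lut[256];

    /* build a simple gamma-correction curve as stand-in calibration data */
    void build_lut(double gamma)
    {
        int i;
        for (i = 0; i < 256; i++)
            lut[i] = (unsigned short)(pow(i / 255.0, gamma) * LUT_MAX + 0.5);
    }

    unsigned short correct(unsigned char in)   /* 8-bit in, 14-bit out */
    {
        return lut[in];
    }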

And, I played Doom HEAVILY when it first came out, and even ran 10-base/2 throughout the house to all the computers so that we could play it networked multiplayer. Ah yes, the good old days of NE2000's and having to get IPX/SPX working properly.
 
And, I played Doom HEAVILY when it first came out, and even ran 10-base/2 throughout the house to all the computers so that we could play it networked multiplayer. Ah yes, the good old days of NE2000's and having to get IPX/SPX working properly.

And bringing the network to a standstill by firing the chaingun. :)
 
Indeed, I'm not impressed, nor am I 'missing something' here. I think the author is missing a few things.

Graphics for Doom were impressive for their time, but today they would be a postage stamp on any monitor made in the past 10 years. Doom originally ran in VGA mode 13h, which was 320 x 200 with 256 colors. That resolution and color space are positively tiny compared to modern images.

Doom was a tightly compiled executable, written in C with hand-optimized assembly in the inner loops (assuming that's what they meant by install image). Modern web sites are written in human-friendly HTML and markup, which is, of course, a whole lot less compact than compiled, optimized code.

Don't get me wrong. Doom was an awesome game for its time, and pretty innovative and optimized, all things told. But Doom came out in 1993, when a state-of-the-art computer had an awesomely amazing 8 MEGAbytes of RAM. We've come a long way from 1993: many modern CPUs have 8MB of _cache_. Times change, and while Doom was impressive, next to Halo, Call of Duty, or even possibly Doom's modern reboot, Doom (the 1993 version) is rather dated. Doom's gameplay window can fit into a modern HUD/preview window. Doom's AI is eclipsed by anything that runs today. Doom's graphics themselves are not even truly 3D, but instead a pseudo-3D that was revolutionary for its time.

I keep hearing from oldies that "oh, computers were so much more efficient back then!" From a guy whose first computer was an IBM XT, and whose first real computer was a 386DX/40? You can keep your old 'efficient' games. I'm gonna stick with my Star Trek Online and Master of Orion: Conquer the Stars. ;)
 
My first machine was either a Pong console or a C64; I don't remember which, it was a long time ago. But in 1993 it was either the 25MHz 8086 that turboed to 50 or 100MHz, which got replaced by a 90MHz machine. Either way, the machine had a single Cirrus Logic chip that handled CGA, EGA and VGA on the VESA monitor, but my machine was not an IBM; it was, as they called them back then, an IBM clone, and it did not have an MCGA resolution or mode. Mode 13 is a VESA mode where they split the screen into four parts, each of which could draw one color pixel to one memory space, which still limited it to sixteen colors at 320x200 with a 256-color palette, so you could make up an optimized octree (I always wondered where that came from), which gave you 256 divided by four pages, or 16 colors. You can count them in the game, but short of asking Michael Abrash or John Carmack of id Software (who was a teenager when he wrote Doom in his garage, on what type of hardware?) what mode it ran in or what the colors were, well, most of the clones, as you said, were far cheaper and faster than the business machines. So Mode X or mode 13, the game only had 16 colors, and they saved some colors for the bosses so they stood out more. I did not work on the game, but years ago, when gaming was less stab-you-in-the-back, John used to put up tech docs you could read through on how the engines and games were built. I actually taught myself quite a few things about lighting from reading some of those papers and then applying them in a competing product, Unreal Engine, and the Alias Wavefront PLE, before it was a PLE.

Short version: mode 13 is sixteen colors stored in memory with 256 colors as the possible color choices, same as EGA, same resolution. So I have no idea what my screen said back then at this point, but I can count the colors.

Not sure I want to admit where this came from, since people who admit they played games back then that were not eight-bit seem to get attacked...

1. INTRODUCTION TO THE VGA AND ITS 256-COLOR MODE

Since its first appearance on the motherboards of the IBM PS/2 50, 60
and 80 models in 1987, the Video Graphics Array has been the de facto
standard piece of graphics hardware for IBM and compatible personal
computers. The abbreviation, VGA, was to most people synonymous with
acceptable resolution (640x480 pixels), and a stunning rainbow of colors
(256 from a palette of 262,144), at least compared to the rather gory
CGA and EGA cards.

Sadly, to use 256 colors, the VGA BIOS limited the users to 320x200
pixels, i.e. the well-known mode 13h. This mode has one good and one
bad asset. The good one is that each one of the 64,000 pixels is easily
addressable in the 64 Kb video memory segment at 0A000h. Simply calculate
the offset using this formula:

offset = (y * 320) + x;

Set the byte at this address (0A000h:offset) to the color you want, and
the pixel is there. Reading a pixel is just as simple: just read the
corresponding byte. This was heaven, compared to the havoc of planes and
masking registers needed in 16-color modes. Suddenly, the distance from a
graphics algorithm on paper to an implemented graphics routine in assembly
was cut down to a fraction. The results were impressively fast, too!
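
To make this concrete, here is a rough sketch in C of setting mode 13h and plotting a pixel, assuming a 16-bit real-mode DOS compiler such as Turbo/Borland C (MK_FP() and int86() come from its dos.h; other compilers will differ):

    /* mode 13h pixel access sketch -- 16-bit real-mode DOS assumed */
    #include <dos.h>

    static unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0);

    void set_mode(unsigned char mode)        /* BIOS INT 10h, AH = 00h     */
    {
        union REGS r;
        r.h.ah = 0x00;
        r.h.al = mode;                       /* 0x13 = 320x200, 256 colors */
        int86(0x10, &r, &r);
    }

    void put_pixel(int x, int y, unsigned char color)
    {
        vga[(unsigned)y * 320 + x] = color;  /* offset = (y * 320) + x     */
    }

    unsigned char get_pixel(int x, int y)
    {
        return vga[(unsigned)y * 320 + x];   /* reading the same byte      */
    }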

The bad asset is that mode 13h is also limited to only one page, i.e.
the VGA can hold only one screenful at any one time (plus 1536 pixels, or
about four lines). Most 16-color modes let the VGA hold more than one page,
and this enables you to show one of the pages to the user, while drawing on
another page in the meantime. Page flipping is an important concept in making
flicker free animations. Nice looking and smooth scrolling is also almost
impossible in mode 13h using plain VGA hardware.

Now, the alert reader might say: "Hold on a minute! If mode 13h enables
only one page, this means that there is memory for only one page. But I
know for a fact that all VGAs have at least 256 Kb RAM, and one 320x200
256-color page should consume only 320*200=64000 bytes, which is less
than 64 Kb. A standard VGA should have room for a little more than four 320x200
pages!" Quite correct, and to see how the BIOS puts this limitation on
mode 13h, I'll elaborate a little on the memory organization of the VGA.

The memory is separated into four bit planes. The reason for this stems
from the EGA, where graphics modes were 16-color. Using bit planes, the
designers chose to let each pixel on screen be addressable by a single
bit in a single byte in the video segment. Assuming the palette has
not been modified from the default, each plane represents one of the EGA
primary colors: red, green, blue and intensity. When modifying the bit
representing a pixel, the Write Plane Enable register is set to the
wanted color. Reading is more complex and slower, since you can
only read from a single plane at a time, by setting the Read Plane
Select register. Now, since each address in the video segment can
access 8 pixels, and there are 64 K addresses, 8 * 65,536 = 524,288
16-color pixels can be accessed. In a 320x200 16-color mode, this makes
for about 8 (524,288/(320*200)) pages, in 640x480 you get nearly 2
(524,288/(640*480)) pages.

In a 256-color mode, the picture changes subtly. The designers decided
to fix the number of bit planes to 4, so extending the logic above to 8
planes and 256 colors does not work. Instead, one of their goals was to
make the 256-color mode as easily accessible as possible. Comparing the
8 pixels/address in 16-color modes to the 1-to-1 correspondence of
pixels and addresses of mode 13h, one can say that they have
succeeded, but at a certain cost. For reasons I am not aware of, the
designers came up with the following effective, but memory-wasting
scheme:

The address space of mode 13h is divided evenly across the four bit
planes. When an 8-bit color value is written to a 16-bit address in the
VGA segment, a bit plane is automatically selected by the 2 least
significant bits of the address. Then all 8 bits of the data is written
to the byte at the 16-bit address in the selected bitplane (have a look at
figure 1). Reading works exactly the same way. Since the bit planes are so
closely tied to the address, only every fourth byte in the video memory is
accessible, and 192 Kb of a 256 Kb VGA go to waste. Eliminating the
need to bother about planes sure is convenient and beneficial, but to
most people the loss of 3/4 of the total VGA memory sounds just hilarious.

To accommodate this new method of accessing video memory, the VGA
designers introduced a new configuration bit called Chain-4, which
resides as bit number 3 in index 4 of the Sequencer. In 16-color modes,
the default state for this bit is off (zero), and the VGA operates as
described earlier. In the VGA's standard 256-color mode, mode 13h, this
bit is turned on (set to one), and this turns on the tying of bit
planes to memory addresses.

In this state, the bit planes are said to be chained together, thus mode
13h is often called a _chained mode_.

Note that Chain-4 in itself is not enough to set a 256-color mode -
there are other registers which deal with the other subtle changes in
nature from 16 to 256 colors. But, as we now will base our work with
mode X on mode 13h, which already is 256-color, we won't bother about
these for now.
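
As a rough illustration of what "unchaining" looks like in practice, here is a sketch in the same style, assuming a Borland-style DOS compiler (outportb() and MK_FP() from dos.h). The register values are the commonly used ones; a full Mode X setup also retunes the CRTC for 320x240, which is left out here:

    /* unchain mode 13h (clear Chain-4) and write pixels plane by plane */
    #include <dos.h>

    #define SEQ_INDEX   0x3C4
    #define SEQ_DATA    0x3C5
    #define CRTC_INDEX  0x3D4
    #define CRTC_DATA   0x3D5

    static unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0);

    void unchain(void)
    {
        outportb(SEQ_INDEX, 0x04);    /* Sequencer Memory Mode register */
        outportb(SEQ_DATA,  0x06);    /* clear Chain-4 (bit 3)          */
        outportb(CRTC_INDEX, 0x14);   /* Underline Location register    */
        outportb(CRTC_DATA,  0x00);   /* turn off doubleword addressing */
        outportb(CRTC_INDEX, 0x17);   /* CRTC Mode Control register     */
        outportb(CRTC_DATA,  0xE3);   /* use byte addressing            */
    }

    /* In the unchained layout each plane holds every fourth pixel, so one
       320x200 page needs only 16,000 bytes per plane and four pages fit. */
    void put_pixel_unchained(int x, int y, unsigned char color)
    {
        outportb(SEQ_INDEX, 0x02);           /* Map Mask register        */
        outportb(SEQ_DATA, 1 << (x & 3));    /* enable plane (x mod 4)   */
        vga[(unsigned)y * 80 + (x >> 2)] = color;
    }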

If you promise to leave that poor sap out of it, I will post a link to it.
 
Drakken, you are sorely mistaken on mode 13h -- it is the standard indexed 320x200x256-color mode, which is what the article you pasted agrees with (and it goes on to elaborate on the undocumented, so-called Mode X, 320x240, which also enabled planar memory mode).

Mode 13h allowed for 256 unique colors per pixel, which indexed a lookup table with 6 bits per color component (i.e. 262,144 total colors).
You are totally wrong in that it only had 16 colors and was the same as EGA -- it had 256 REAL colors at any given time and 320x200 resolution.
Mode X simply tweaked the hardware a bit to get it into a 320x240 mode with square pixels, but the same indexed lookup table.

There were some techniques you could use to display static images that involved a timer interrupt synchronized to the horizontal blanking period at the end of each scan line, such that you would then reload the contents of the CLUT before the next scan line was generated. This allowed you to get 256 unique colors PER SCAN LINE rather than per screen.
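
Roughly, the trick looked something like this (a sketch only, assuming Borland-style inportb()/outportb(); in practice it had to be done from cycle-counted assembly or a very tightly timed interrupt, not polled C):

    /* wait for blanking, then reload one DAC (palette) entry */
    #include <dos.h>

    #define INPUT_STATUS_1  0x3DA    /* bit 0 set during blanking */
    #define DAC_WRITE_INDEX 0x3C8
    #define DAC_DATA        0x3C9

    void wait_for_blank(void)
    {
        while (inportb(INPUT_STATUS_1) & 0x01)   ;  /* leave current blank */
        while (!(inportb(INPUT_STATUS_1) & 0x01)) ; /* wait for next blank */
    }

    void set_dac_entry(unsigned char index,
                       unsigned char r, unsigned char g, unsigned char b)
    {
        outportb(DAC_WRITE_INDEX, index);
        outportb(DAC_DATA, r);    /* 6 bits per component, 0..63 */
        outportb(DAC_DATA, g);
        outportb(DAC_DATA, b);
    }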

When SuperVGA came about, the same pattern was followed with 800x600 and 1024x768 still using the 256 color CLUT mode -- and this was the mode Windows 3.1 generally utilized most often, with the bottom 16 colors of the CLUT remaining fixed to allow UI elements to remain consistent, with the higher CLUT entries being changed dynamically based on the optimal color combination on the screen.

It was not until the Sierra HiColor DAC that a true 2 byte per pixel (5:5:5 or 5:6:5) format became common (32K/64K unique colors per screen). It operated by doubling the dot clock rate to allow for 16 bits of data in the same horizontal period as previously had been used for 8, and then disabled the color lookup table. That way the raw data was clocked into the DAC directly. I had both the Edsun DAC (which used per line hinting to generate interpolated colors on the fly) and the Sierra DAC.

In terms of what computer you had in 1993, I have no clue, but it sounds pretty dated. By the tail end of 1993/first of 1994, I was running an AMD 486DX2-50 at home and a dual 66Mhz Pentium at work (on NT). As far as the monitor I had then, I *think* it was a Sony Trinitron 17", which I replaced shortly thereafter with a Philips 20CM64 20" (that weighed almost 80#).

And, frighteningly enough, the keyboard I am typing this on is still the same Philips P2813 series 101 key keyboard that I have had since 1992 (it came with a Magnavox 286 desktop I purchased at a company clearance sale -- and stripped for parts). It's connected through a PC/AT to PS/2 adapter to my Cybex Switchview KVM and then through an IoGear PS/2 to USB adapter. It still works great though even after almost 25 years and even the letters haven't worn off yet (other than maybe half the A key).

As for your Cirrus Logic chip not having MCGA modes, it had to be one of the REALLY early VEGA series (or maybe Eagle II, I'd have to check) that were really sort of "Enhanced EGA" designs, but sold as VGA. And if you have any questions on this, I can look it up, as I have almost as many Cirrus Logic datasheets as I do Tseng or Trident, though they tend to be more heavily weighted toward the 5425-5434 era (as, in 1995/1996, I was actually involved in meetings with Cirrus Logic at their headquarters, discussing the use of their cards for the Philips "PCTV" project, and I was receiving engineering sample cards direct from them along with internal/confidential engineering datasheets).

That said, if you played the original Doom on said PC, then it was definitely one of the later models that DID support 320x200x256 color mode, because that was the ONLY mode that the original Doom even supported (ditto for Raven's Heretic).

And, I played a LOT of games in this era -- starting with Wolfenstein 3D (pre-Doom), Doom, Heretic, Hexen, Duke 3D, Quake, etc. It was with Quake and the advent of cards like the 3DFx Voodoo that REAL 3D games began to show up.
 