Just for fun: 3DMark 2001SE

merlin704

The Great Procrastinator
So just for fun, I decided to play around with 3DMark2001SE on some modern hardware. This is not my main system, but one I built for retro gaming (mid-90s to 2000s). It does very well for what it is. It's an Intel 11th Gen NUC.

Specs:

Intel Core i5-1135G7 @ 2.4 GHz (boost 4.2 GHz)
32 GB Crucial DDR4-3200
Intel Iris Xe Graphics
Samsung 1 TB SSD

If you want to try it out on whatever system you have, here are the links you will need:

VOGONS Running 3dMark2001 patch thread:
https://www.vogons.org/viewtopic.php?t=51587

3DMark Legacy Benchmarks:
https://benchmarks.ul.com/legacy-benchmarks

If you want to compare with me or others here, use the default settings to keep things equal.

Here are my results:

1711378507600.png
 
It's been a long time since I ran this. I still have a copy lying around. Here's what it does on my PC with the following specs:

AMD Ryzen 9 5950X
64 GB DDR4-3600
AMD Radeon RX 7900 XTX
1 TB SSD (PCIe 4.0)

3dmark2001se.png


The FPS counter is amusingly stuck at 999 most of the time. It actually goes waaaay over that, sometimes enough to cause my GPU to whine. It dropped below 100 fps a couple of times, though. From your screenshot, it was the same test that was slowest for you too. I wonder what the bottleneck there is?
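For anyone puzzled by the 999: the on-screen readout is presumably just a three-digit display that clamps, not a measurement ceiling. A minimal sketch of that assumption (the frame times below are made up):

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    // Hypothetical frame times in milliseconds, just to show the clamp kicking in.
    const double frameTimesMs[] = {0.4, 0.7, 1.3, 12.5};
    for (double ms : frameTimesMs) {
        const int fps   = static_cast<int>(1000.0 / ms);
        const int shown = std::min(fps, 999);  // a three-digit readout tops out at 999
        std::printf("actual ~%d fps -> overlay shows %d\n", fps, shown);
    }
    return 0;
}
```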
 

That's quite interesting. I wonder if it's just a limitation of the software itself?
 
One thing I really wonder about is the massive number of motherboards out there that have special BIOS options related to 3DMark 2001, sometimes even referred to as an "enhancement". That leads me to believe that either it's really easy to fudge the numbers for this old benchmark, or there is something about modern (or semi-modern) hardware that holds it back because the benchmark is so old.

IMO, 3DMark Vantage is a better "old" benchmark. It's a DirectX 10 benchmark, but DirectX 11 is essentially a superset of DirectX 10, so it's still relevant in many ways since DirectX 11 remains fairly prevalent.

Couldn't "Show Details" because the file was "too large" lol
5800X3D + 4080 (rest of specs in sig)

3dmark2001se.jpg
 
I don't think any of these synthetic benchmarks are to be taken seriously; this was more just for fun.
 
There's probably some cheesing going on (older 3DMark versions were very easy to cheese higher results out of), but AFAIK it's mostly the hardware. Recent GPUs have atrocious DX8 performance, and DX9 performance isn't so great anymore either. I just checked HWBot, and the current WR for 3DMark 2001SE was set with a GTX 980 Ti, which says a lot about how more recent architectures handle ancient DX versions.
 
Interesting that you got a higher score with a 4060 Ti than GotNoRice did with a 4080. That suggests that it's more influenced by CPU than GPU. Of course the extra cache of an X3D CPU won't help with something so old, so probably Intel CPUs are better here. Maybe someone with a 14900KS would like to have a go?
 
I noticed that as well. I thought about running it on my 14700K box, and maybe on a box with an AMD-flavored GPU, just for giggles, if I have time tonight.
After looking at that again, the refresh rate is incorrect; it should be 165.
 
Lol, I'm glad to see I'm not the only one who runs 3DMark2001SE when I upgrade for the lulz. It was the first benchmarking software I used, so it's always funny to see how things have changed, especially in areas where I remember it tanking performance on my old TNT2.

Interesting results in this thread for sure, thanks for posting!
 
That's what I'd expect. 3DMark 2001 is almost certainly single-threaded, and GPU throughput has gone up massively compared to the improvements we've seen in CPU single-thread performance.
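To make that concrete, here is a toy model (all numbers invented) that treats a frame as taking as long as the slower of two stages: single-threaded CPU submission work and GPU render work. Once the GPU stage shrinks to almost nothing, the score is effectively a single-thread CPU benchmark, which would explain why the GPU model matters so little here:

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    // Invented numbers for illustration only.
    const double cpuMsPerFrame = 1.0;   // single-threaded submission/driver work per frame
    const double gpuMsOld      = 8.0;   // roughly what a 2001-era GPU might need per frame
    const double gpuMsModern   = 0.05;  // a modern GPU finishing the same frame almost instantly

    // Simplified: the frame takes as long as the slower stage.
    const double fpsOld    = 1000.0 / std::max(cpuMsPerFrame, gpuMsOld);
    const double fpsModern = 1000.0 / std::max(cpuMsPerFrame, gpuMsModern);

    std::printf("old GPU:    %4.0f fps (GPU-bound)\n", fpsOld);
    std::printf("modern GPU: %4.0f fps (CPU-bound, capped by one CPU thread)\n", fpsModern);
    return 0;
}
```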

i9-10980/64GB DDR4-3600 quad channel/3090. No OC. And yeah, it thinks I have Celerons, like 2-3 plants worth (36).
3DMark2001.jpg

 
That brought back some memories... can't believe it's been over two decades.

Anyway, this is my workstation (it plays the occasional game): RX 480 8GB, 64 GB DDR4, and a pair of Xeon 2637 v4 CPUs on a Supermicro board.

3dm.png
 
Wow, this takes me back. Oh yeah, my 4090 got the same score; maybe the 14900K has something to do with it.

View attachment 645909

100% it's the CPU. It looks like Intel has a significant advantage over AMD Ryzen in this software. My 7900 XTX paired with my 5800X3D only managed 71,409, which is far lower than what people in this thread are getting with Intel CPUs and slower GPUs.
 
The graphics pipeline has evolved so much since DirectX 8.1 that I don't think modern graphics cards even know what to do with a lot of those functions and calls anymore. They have to be translated into something the hardware can understand, and that translation adds to the render time.
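As a rough illustration of that translation cost, here is a toy sketch; the enum and the shader generator are hypothetical stand-ins, not the real D3D8 API. The idea is that a fixed-function state combination has no dedicated silicon to land on anymore, so something has to turn it into shader code before a modern GPU can run it:

```cpp
#include <cstdio>
#include <string>

// Hypothetical stand-in for a fixed-function texture-stage color op
// (conceptually similar to the DX8-era D3DTOP_MODULATE / D3DTOP_ADD states).
enum class ColorOp { Modulate, Add };

// A translation layer has to emit equivalent shader code for the programmable
// pipeline; each state combination becomes another generated variant.
std::string BuildPixelShaderSource(ColorOp op) {
    const std::string expr = (op == ColorOp::Modulate) ? "tex0 * diffuse"
                                                       : "tex0 + diffuse";
    return "return " + expr + ";";
}

int main() {
    std::printf("generated shader body: %s\n", BuildPixelShaderSource(ColorOp::Modulate).c_str());
    std::printf("generated shader body: %s\n", BuildPixelShaderSource(ColorOp::Add).c_str());
    return 0;
}
```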
 