3DMark06 - Two days time

Sc4freak said:
Oh, yes, that explains why the CPU score is now included in the overall score in '06. :rolleyes:

According to Rage3D, the 4 graphics tests aren't really dependent on the CPU at all and are in fact programmed not to use much CPU power.

Of course there are 2 CPU tests and they are factored into the score, but they are pretty much stand-alone and don't affect the SM2/SM3 tests.

So if you took a specific system and swapped the CPU from, say, a 2.2GHz Newcastle to an X2 at 2.7GHz, of course the CPU score will change dramatically, but supposedly you'll barely see a difference in the 4 graphics tests.
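
To illustrate how that can work: roughly speaking, the graphics results and the CPU result get folded into one number with a weighting that favours the graphics side, so swapping the CPU moves the total a bit but leaves the graphics subscores themselves alone. Here's a rough C++ sketch of that kind of weighted combination; the weights and subscores below are made up for illustration, not Futuremark's actual constants.

```cpp
// Illustrative sketch: combining graphics and CPU subscores with a weighted
// harmonic mean, so the weaker component drags the total down without the
// graphics subscore itself changing. Weights/scale are invented for the example.
#include <cstdio>

double compositeScore(double graphicsScore, double cpuScore,
                      double wGraphics = 1.7, double wCpu = 0.3,
                      double scale = 2.5)
{
    // Weighted harmonic mean: dominated by whichever component is weakest.
    return scale / (wGraphics / graphicsScore + wCpu / cpuScore);
}

int main()
{
    // Hypothetical subscores: a much faster CPU doubles the CPU subscore,
    // but the graphics subscore stays exactly the same.
    std::printf("slow CPU: %.0f\n", compositeScore(3000.0, 700.0));
    std::printf("fast CPU: %.0f\n", compositeScore(3000.0, 1400.0));
    return 0;
}
```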
 
Visually there's nothing particularly new; HDR isn't anything special anymore, and neither are most of the other effects. It still remains totally useless at comparing different hardware. The game tests are rubbish, and the CPU benchmarks take far too long to run even on a 4000+ @ FX-55 speeds.

I'm glad I downloaded it overnight; had I interrupted other things to download nearly 600MB I'd have probably been quite annoyed.

Still, reviewers are going to add it to card benchmarks for "completeness", and so the hardware vendors are going to be at each other's throats again, optimisations left, right and centre, and it's going to take me 2 extra hours to find drivers which don't run 3DMark06 20% faster but normal games 20% slower.
 
101 said:
I like how they used the almost square garbage resolution of 1280x1024 as the default.
Yeah, what's up with that?!
Probably why my score is so low = 911 with sig system. :(


DougLite said:
I would imagine that the pervasiveness of 17 and 19 inch LCDs that run at a native resolution of 1280x1024 motivated that decision, along with a need to run at a higher resolution to make it less CPU limited and more bound to GPU(s) performance.
Makes sense, but it sux for those of us like me who have monitors that don't like that res.
 
I also noticed the native resolution of the benchmark is 1280x1024, which isn't a step in the right direction IMO; most people with high-end video cards (which you seem to need to run this) aren't likely to spend money on an LCD monitor, most of which only support tiny resolutions.
 
Well, the reactions are the same with every new 3DMark release, aren't they? :)
Or well, not really...
With 3DMark03, people were bitching because the NVIDIA FX series got creamed.
With 3DMark05, people were bitching because NVIDIA cards used some features that ATi didn't support.
With 3DMark06, people are bitching because the ATi X1800 series gets creamed, and because NVIDIA cards use some features that ATi doesn't support.

And why the bitching? Because it wouldn't be comparable to reality (aka games)?
Thing is, the games are the synthetic ones, 3DMark shows what your hardware is REALLY capable of.
Cards need workarounds for the features they don't support.
In games, well either the features are just disabled, or the drivers are erm 'optimized'. So although the framerates may be more comparable, the actual rendering is not. And for some reason, reviews/benchmarks often fail to mention that. I think it's important to know about both performance and render quality/features when comparing hardware. Just a raw framerate doesn't mean anything if you don't know what exactly is being rendered.
 
The difference is that 3DMark05 at least showed where each card roughly stood in regards to performance in games. With 3DMark06, that seems to have been completely erased.

Since when does a 7800GTX 256MB consistently beat the X1800XT 512MB in games?
 
Sc4freak said:
The difference is that 3DMark05 at least showed where each card roughly stood in regards to performance in games. With 3DMark06, that seems to have been completely erased.

It shows what would happen when games start using HDR/FP16 blending extensively.

Sc4freak said:
Since when does a 7800GTX 256MB consistently beat the X1800XT 512MB in games?

Since games use HDR/FP16?
The only game I know of atm is Far Cry... and I don't think it can use HDR on an ATi card at all. So well, perhaps the ATi cards aren't slower in Far Cry, but they sure look a lot different when you enable HDR.

ATi is getting punished for not putting in FP16 blending support, and rightly so. NVIDIA has supported it since the 6x00 series.
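
For what it's worth, here's roughly how a D3D9 app can check whether a card supports FP16 blending at all. A minimal sketch, assuming the DirectX SDK headers and leaving error handling out:

```cpp
// Minimal Direct3D 9 sketch: can the default adapter create an FP16
// (A16B16G16R16F) render target, and can it alpha-blend into it
// (post-pixel-shader blending) -- the capability FP16 HDR needs.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // FP16 render target support at all?
    HRESULT rt = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                        D3DFMT_X8R8G8B8, D3DUSAGE_RENDERTARGET,
                                        D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F);

    // FP16 blending (post-pixel-shader blending) support?
    HRESULT blend = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                           D3DFMT_X8R8G8B8,
                                           D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
                                           D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F);

    std::printf("FP16 render target: %s\n", SUCCEEDED(rt) ? "yes" : "no");
    std::printf("FP16 blending:      %s\n", SUCCEEDED(blend) ? "yes" : "no");

    d3d->Release();
    return 0;
}
```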
 
What about the Source engine's HDR?
Remember, when saying "HDR", whose HDR are you talking about? HDR in FEAR and Far Cry and Source are not one and the same....... apples to oranges....


Scali, how can you believe that? 3DMark does nothing to address the AI aspect of gameplay... let alone test for multiplayer aspects... and what about sound? If they are testing the SYSTEM, why don't they use sound? Sound can have an impact on gameplay too... real in-game performance (as you would play it, with sound and AI and such) will beat any benchmark program in determining how good of a system you have.
 
Scali,

The X1800XT series works fine with HDR in Far Cry. You just need a patch due to the way Crytek implemented it. There have been users here posting screenshots of it working.

The biggest problem is that FM has allowed the freedom of using non-standard DX features and has strayed away from being a true DX9 benchmark. And their choices on how and why each feature was used seem to be inconsistent. Read the thoughts about it from Hexus, B3D and ET, as they point out some really bizarre design choices that all seem to favor one IHV. Had they stuck to a pure DX9 path, none of this would have been an issue...
 
Ace_McGirk said:
What about the Source engine's HDR?
Remember, when saying "HDR", whose HDR are you talking about? HDR in FEAR and Far Cry and Source are not one and the same....... apples to oranges....

That's right, Source uses an HDR-ish effect with reduced quality on ATi cards. It works, framerates are okay, but again you are not rendering the same thing on both cards.

Ace_McGirk said:
Scali, how can you believe that? 3DMark does nothing to address the AI aspect of gameplay... let alone test for multiplayer aspects... and what about sound? If they are testing the SYSTEM, why don't they use sound? Sound can have an impact on gameplay too... real in-game performance (as you would play it, with sound and AI and such) will beat any benchmark program in determining how good of a system you have.

That's a different story; I thought we were talking about how graphics cards relate to each other in games and in this benchmark.
Your question is much like "Why doesn't my car racing game get the same performance on my system as my multiplayer strategy game?"
The answer is simple: It's not the same type of game.
 
Jbirney said:
The biggest problem is that FM has allowed the freedom of using non-standard DX features and has strayed away from being a true DX9 benchmark.

Yes, and? Games like Far Cry also use these non-standard DX features. What's more important for a gamers benchmark? Sticking to 'true DX9', or using rendering methods comparable to the ones used in actual games?

Jbirney said:
And their choices on how and why each feature was used seem to be inconsistent.

Erm, so? Futuremark is just a company. Company policies and strategies change over the years. The industry is dynamic, and so should your company be.
It's like the constant nagging over Intel moving from extreme clockspeeds to extreme parallelism.

Jbirney said:
Read the thoughts about it from Hexus, B3D and ET, as they point out some really bizarre design choices that all seem to favor one IHV. Had they stuck to a pure DX9 path, none of this would have been an issue...

You mean uninformed thoughts from self-proclaimed internet journalists who don't have the actual knowledge or experience regarding game/graphics technology? I don't value those thoughts very highly. First of all, they ignore that these extensions were only supported by 1 IHV at the time, but this is no longer the case. Futuremark knew this at the time of their decision, so they were making a decision based on the expected support of the functionality, not because one IHV happened to support it at the time. With 3DMark03 the same happened... SM2.0 was only supported by ATi at the time. But apparently that was okay, because it was 'pure DX9'. Well, in the real world, that doesn't matter at all. Many popular games use these extensions, so it is only representative that Futuremark chose to support them as well. And it is a good thing that they changed their policy in order to follow the industry. They think practically.
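
For context, the 'extensions' in question are mostly depth-texture shadow maps, which games detect via vendor-exposed formats rather than a core DX9 cap. Here's a minimal sketch of that kind of check; the two formats shown are the commonly cited NVIDIA and ATi ones, so treat the list as illustrative rather than exhaustive:

```cpp
// Minimal D3D9 sketch: probe for vendor-specific depth-texture shadow map
// support. NVIDIA exposes regular depth formats (e.g. D24X8) as texturable
// with hardware PCF; ATi exposes FOURCC depth formats such as DF16.
#include <d3d9.h>
#include <cstdio>

static bool DepthTextureSupported(IDirect3D9* d3d, D3DFORMAT fmt)
{
    return SUCCEEDED(d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                            D3DFMT_X8R8G8B8, D3DUSAGE_DEPTHSTENCIL,
                                            D3DRTYPE_TEXTURE, fmt));
}

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    const D3DFORMAT df16 = (D3DFORMAT)MAKEFOURCC('D', 'F', '1', '6'); // ATi FOURCC
    std::printf("D24X8 depth texture (NV-style): %s\n",
                DepthTextureSupported(d3d, D3DFMT_D24X8) ? "yes" : "no");
    std::printf("DF16 depth texture (ATi FOURCC): %s\n",
                DepthTextureSupported(d3d, df16) ? "yes" : "no");

    d3d->Release();
    return 0;
}
```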
 
Ace_McGirk said:
Scali, how can you believe that? 3DMark does nothing to address the AI aspect of gameplay... let alone test for multiplayer aspects... and what about sound? If they are testing the SYSTEM, why don't they use sound? Sound can have an impact on gameplay too... real in-game performance (as you would play it, with sound and AI and such) will beat any benchmark program in determining how good of a system you have.



That's right.... I understand the reason some sites "test" with no sound, to eliminate that influence, but who plays games with NO SOUND? I certainly don't... I really do like the in-game testing methodology.... plus if two different cards are tested on the SAME SYSTEM, including whatever sound card they use, then the results are STILL valid..... so why bench with no sound, for the simple sake of a useless "real world" benchmark???


Kudos to the [H] for embracing a "real" benchmark method.....
 
Not that I REALLY care what this silly bench gets, but I get the below score with the cpu at 2.6ghz, card clocked at 493/1300.....seems low, but you know what....REAL games play just fine :p
Plus, I did not tweak anything in the control panel of the vid card, I just selected the profile then ran the default bench....


3dm06.jpg
 
I believe in real-world performance ATi and nVidia are much closer to each other. But if someone used this 3DMark06 to judge between them, they would be led to believe that ATi couldn't come close to the competition.

Disclaimer: This isn't a pro-ATi or anti-nVidia post; as you can see from the list, I use nVidia in SLI. I am just pointing out the shortcomings of this program as I see them.
 
DocFaustus said:
I believe in real-world performance ATi and nVidia are much closer to each other.

Depends on what you consider the 'real world'.
Is the real world a world where both cards render virtually the same scenes, with virtually the same image quality?
Or is the real world a world where the slower cards get simpler scenes and lower quality, so the actual framerates are virtually the same?
 
Brent_Justice said:
The real world is games, and the gaming experience they provide.

It's really that simple.
Yup!

And for the poster above questioning me, I will quickly state how I set up my games: I set the game to the desired resolution (1600x1200) and then increase all the in-game settings until I reach a point where my framerate is no longer acceptable, then back off just a little. If I max out all of the in-game settings and my framerate is still fine, then I go out of the game and manually adjust the AA as much as I can (AF is always maxed out).
 
Brent_Justice said:
The real world is games, and the gaming experience they provide.

It's really that simple.

No it's not, because you use the term "gaming experience" for the exact same thing that I was asking. It's a useless statement. Like "What card would you buy?" - "The best one".
Obviously, but what is "best"?

"Is the real world a world where both cards render virtually the same scenes, with virtually the same image quality?
Or is the real world a world where the slower cards get simpler scenes and lower quality, so the actual framerates are virtually the same?"

What is the best gaming-experience?
For me it would primarily be the best possible image quality, and the framerate would be secondary. The best possible quality at the best possible framerate. Which means NVIDIA for me, this round, just like 3DMark06 shows in its score.
 
DocFaustus said:
And for the poster above questioning me, I will quickly state how I set up my games: I set the game to the desired resolution (1600x1200) and then increase all the in-game settings until I reach a point where my framerate is no longer acceptable, then back off just a little. If I max out all of the in-game settings and my framerate is still fine, then I go out of the game and manually adjust the AA as much as I can (AF is always maxed out).

It sounds like you never even considered the possibility that the same settings may not result in the same quality on all videocards.
Knowing that one card looks considerably better than another, but has slightly lower framerates (say 75 vs 70 fps), would you still just pick the card that gets the highest framerate at maximum settings?
 
I think you are overthinking things, Scali.

It is quite simple, whatever card allows the highest image quality settings at the highest framerates is the better card.

For example, let's say in FEAR card A allows you to play at 1280x1024 at 2XAA/8XAF with shadows off, dynamic lights turned down and particles turned down.

Now let's say Card B allows you to play at 1600x1200 4XAA/16XAF with everything on, shadows, lights, particles.

Obviously Card B is allowing a more immersive gaming experience.

That's an extreme example but it shows how simple it is.

And sometimes, cards allow an equal IQ at a similar performance level, in that case either one would be fine.
 
Scali said:
It sounds like you never even considered the possibility that the same settings may not result in the same quality on all videocards.
Knowing that one card looks considerably better than another, but has slightly lower framerates (say 75 vs 70 fps), would you still just pick the card that gets the highest framerate at maximum settings?
But the 3DMark score you get doesn't tell you which one looks better; hell, it doesn't even tell you whether the screen is being drawn properly or if your card is artifacting. All these important checks are being done with your eye, not with the program. So why use an arbitrary program that doesn't test everything to say how it is going to work in all games? Why not go straight to the game?

3DMark is a neat little demo and can be used as a tool to help determine max overclock when comparing a system to itself. When you start comparing across platforms the test comes up severely lacking.
 
Scali said:
Yes, and? Games like Far Cry also use these non-standard DX features. What's more important for a gamers benchmark? Sticking to 'true DX9', or using rendering methods comparable to the ones used in actual games?

Yes, and you bring up a good point. Developers like Crytek added many more features for both ATI and NV to their games. But 3DMark06 did not add them all, or only added a small subset that THEY thought was fair. Also, many developers will use different paths for NV and ATI, so shouldn't 3DMark then do the same? I guess what I was trying to get at is that 3DMark should have stuck to DX9 only, or added a majority of features from both sides and thus had two different paths like most games will have.


Scali said:
You mean uninformed thoughts from self-proclaimed internet journalists who don't have the actual knowledge or experience regarding game/graphics technology? I don't value those thoughts very highly.

So we should trust you, a jaded developer who was kicked out of their forums because...? :) Point is, they have info that's worth adding, as do you. I couldn't care less what you think of them, as that doesn't matter to this topic. No offense meant, btw.


Scali said:
Many popular games use these extensions, so it is only representative that Futuremark chose to support them as well. And it is a good thing that they changed their policy in order to follow the industry. They think practically.

The problem is that these games also use similar features and/or workarounds on both IHVs. Yet FM seems to make choices that ignore the features or workarounds ATI cards will need, and we know most developers will not do that... Thus you get some confusing data...
 
Brent_Justice said:
Obviously Card B is allowing a more immersive gaming experience.

Yes, but did you ever notice that most people don't even seem to realize that there are different effects and things on different cards?
With 3DMark you get the other side of the story. What if the cards rendered virtually the same scene? I think this is far more interesting info than some framerates of games in a review, with no attention spent on which card actually renders what, and how.
That only tells me that the game programmers did a nice job in evening out performance differences.
 
DocFaustus said:
But the 3DMark score you get doesn't tell you which one looks better

No, but it tries to render as equally as possible on all hardware. So you can assume they look pretty much the same, apart from bugs in hardware and software.

DocFaustus said:
hell, it doesn't even tell you whether the screen is being drawn properly or if your card is artifacting.

No, but I don't think that is 3DMark's job. If things can't be drawn properly, or have artifacts, then that problem will exist in all software, and should come up in things like the MS WHQL test.

DocFaustus said:
All these important checks are being done with your eye, not with the program. So why use an arbitrary program that doesn't test everything to say how it is going to work in all games? Why not go straight to the game?

Who says I'm interested in the game? I'm interested in the capabilities of the hardware. I want to decide which card will serve me best in the near future. Running current games won't tell me how future games will run. The games I will actually be playing with that card. 3DMark06 can do that for me.

DocFaustus said:
3DMark is a neat little demo and can be used as a tool to help determine max overclock when comparing a system to itself. When you start comparing across platforms the test comes up severely lacking.

Looks more like you don't know how to use the tool.
 
Scali said:
What if the cards rendered virtually the same scene? I think this is far more interesting info than some framerates of games in a review, with no attention spent on which card actually renders what, and how.
That only tells me that the game programmers did a nice job in evening out performance differences.

That's not how things are with games.
 
Jbirney said:
Yes, and you bring up a good point. Developers like Crytek added many more features for both ATI and NV to their games. But 3DMark06 did not add them all, or only added a small subset that THEY thought was fair. Also, many developers will use different paths for NV and ATI, so shouldn't 3DMark then do the same? I guess what I was trying to get at is that 3DMark should have stuck to DX9 only, or added a majority of features from both sides and thus had two different paths like most games will have.

You will have to quantify this. What features did Crytek use that 3DMark didn't use, and why should 3DMark be using them?


Jbirney said:
So we should trust you, a jaded developer who was kicked out of their forums because...? :) Point is, they have info that's worth adding, as do you. I couldn't care less what you think of them, as that doesn't matter to this topic. No offense meant, btw.

Oh please. This is not at all about B3D, and it is very sad of you to even bring that up. This is about reviewers everywhere not having a clue about current hardware and software. It's just over their heads. Which would be fine if they realized it, but they don't. So they give out all kinds of misinformation.

Jbirney said:
The problem is that these games also use similar features and/or workarounds on both IHVs. Yet FM seems to make choices that ignore the features or workarounds ATI cards will need, and we know most developers will not do that... Thus you get some confusing data...

Erm, wait a second... what the heck are you on about?
Wasn't the whole deal BECAUSE of the workarounds? ATi is slower because it uses a workaround, which will give similar quality.
I think it's much more confusing if a workaround is used that renders at much reduced quality, or the effect is dropped altogether. What exactly are you comparing then?
Might as well compare against a GeForce2 card that renders the whole scene without any shading whatsoever.
 
Is there a video demo available for those of us not inclined to install the benchmark? :D
 
CoW]8(0) said:
Is there a video demo available for those of us not inclined to install the benchmark? :D
If you want to see what the '06 demo looks like but don't want to install the program, here is what you do:

1) Launch '05 (install if needed)
2) Turn the contrast on your monitor way up, and increase the brightness a bit too....
3) Click the Demo button and you will get a good idea of what '06 looks like.
 
That's what I was asking for before it was released. I still haven't seen the new demos 'cause it requires SM3. =P

Most of it was existing demos from 3DMark05 running different shaders/textures etc.
 
The stuff from '05 in '06 tests that 2.0 stuff.... not 3.0. It is a lot tougher and more detailed than '05's... I scored 5000-something with 7800GT SLI.
 
I think I am not going to install or run this bench... I am sick of benching... it's like putting your car on a dyno... oooo yay, another car analogy. lol. *sigh* Anyways...
 
CannibalTrout said:
Offtopic:

TheRapture, where did you get that wallpaper?

Someone in the forums had a screen capture from Serenity in HD widescreen...so it is now one of my widescreen wallpapers...
:p
 
Eh, I'm not really liking this test all that much. It seems this test is even worse for real-world performance than other 3DMark tests.

Example

A person with dual 7800GTX 512s and the same CPU as me (at the same clock speed) only gets 800 more points than I do, even though they get 10fps or more (double the FPS) in each of the video tests. A person with the same card as I have (or even less, for that matter) that gets the same fps or less in the video tests, yet has a dual core (and scores .50 vs .30 in the CPU test), gets a higher score than the person with the dual 7800GTX 512s. In real-world gaming the dual 7800GTX 512s with a single core would annihilate a regular 7800GTX with a dual core. Meh, I can see why people say not to trust 3DMark for real-world performance. It's painfully obvious in 3DMark06.
 
Scali said:
You will have to quantify this. What features did Crytek use that 3DMark didn't use, and why should 3DMark be using them?

OK, my fault. Crytek may have been a bad example vs. 3DMark. But we both know that Crytek is more than happy to add features to their games to support each IHV.


Scali said:
Oh please. This is not at all about B3D, and it is very sad of you to even bring that up. This is about reviewers everywhere not having a clue about current hardware and software. It's just over their heads. Which would be fine if they realized it, but they don't. So they give out all kinds of misinformation.


I am sorry that I did; it's just that since then you have had a VERY LARGE chip on your shoulders that seems to have negatively impacted your views on review sites (which it would do to anyone in that case). I do agree that many reviewers out there don't know the full details on the hardware, engines, etc. and sometimes they are way off. However, there are a few others that have both the technical background and experience to share some valid points. These few should not be lumped in with the others.


My "beef" with 3Dmark is they now have made a benchmark that dose not seem to be a true representive of what future games will be doing. Had they stuck to a pure D3D bath or make two optimized paths for each IHV (as thats what developers will end up doing) then they could have done it "better". Only time will tell if they are right or wrong...
 
Jbirney said:
OK, my fault. Crytek may have been a bad example vs. 3DMark. But we both know that Crytek is more than happy to add features to their games to support each IHV.

And so did Futuremark. Else 3DMark05 would not have had shadowmaps at all on ATi cards, and 3DMark06 would not have had HDR-lighting on ATi cards.
In fact, Futuremark did such a good job that the general consensus was that the ATi 'software-emulated' shadowmaps looked better than NVIDIA's hardware-filtered ones.

Jbirney said:
I am sorry that I did; it's just that since then you have had a VERY LARGE chip on your shoulders that seems to have negatively impacted your views on review sites (which it would do to anyone in that case). I do agree that many reviewers out there don't know the full details on the hardware, engines, etc. and sometimes they are way off. However, there are a few others that have both the technical background and experience to share some valid points. These few should not be lumped in with the others.

If you had known me, you'd know that I've had the same view on review sites long before.
And yes, there are some positive exceptions as well. But I don't want to get into that.

Jbirney said:
My "beef" with 3Dmark is they now have made a benchmark that dose not seem to be a true representive of what future games will be doing.

This is a bit silly. "Truly representative of future games"... You mean that your crystal ball is different from Futuremark's?

Jbirney said:
Had they stuck to a pure D3D path or made two optimized paths, one for each IHV (as that's what developers will end up doing), then they could have done it "better". Only time will tell if they are right or wrong...

You don't get it, that is what they ARE doing. See my first point in this post.
Problem is, where do you draw the line? Games often go too far in making different renderpaths, which makes any comparison meaningless. Half-Life 2 is a good example of that. When comparing a GeForce FX against a Radeon 9x00 series, the performance may be similar, but the FX renders the DX8.1 path, so its output is of much lower quality than the Radeon's.
Forcing the FX to the DX9-path reveals the true nature of the FX-card: it is incapable of rendering a full game with DX9-features at competitive framerates.
So where do you draw the line when benchmarking? I personally want to know if the FX can render a DX9-game properly. I don't want to know how well it does with DX8.1, because that's not going to be the majority of the games I'll be playing when I buy a GeForce FX and plan to play the latest games on it for the next 2 years or so.
So I think Futuremark does the right thing. If they go as far as Valve went, then it's no longer a benchmark, but an exercise in trying to tune the renderpaths to get the same framerates, regardless of what they are actually rendering. Which is useless, because I can even plug my GF2 back into my PC and devise a benchmark where it gets the same framerates as an X1800XT. Does that say anything about the GF2?

Instead, you should try to render the scenes as similarly as the hardware will allow, so have a fallback for things like HDR and shadowmaps, if certain hardware doesn't support it. That's also how games work, except that their default options are not maximum quality on all hardware.
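
To make that fallback idea concrete, here's a rough sketch of the sort of per-feature path selection I mean. The structure and caps flags are hypothetical, not Futuremark's or any game's actual code:

```cpp
// Sketch: pick the closest-to-reference technique for each feature the card
// supports, falling back to an equivalent-quality workaround, and only
// disabling an effect as a last resort. Caps and paths are illustrative.
#include <cstdio>

struct GpuCaps {
    bool sm3;          // Shader Model 3.0 support
    bool fp16Blend;    // FP16 render-target blending
    bool depthTexture; // hardware shadow-map (depth texture) support
};

enum class HdrPath    { Fp16Blend, ShaderWorkaround, Off };
enum class ShadowPath { HardwareDepth, EmulatedDepth };

struct RenderPath {
    HdrPath    hdr;
    ShadowPath shadows;
};

RenderPath ChoosePath(const GpuCaps& caps)
{
    RenderPath p{};
    p.hdr     = caps.fp16Blend ? HdrPath::Fp16Blend
              : caps.sm3       ? HdrPath::ShaderWorkaround
                               : HdrPath::Off;
    p.shadows = caps.depthTexture ? ShadowPath::HardwareDepth
                                  : ShadowPath::EmulatedDepth;
    return p;
}

int main()
{
    GpuCaps caps{ true, false, true }; // hypothetical card: SM3, no FP16 blending
    RenderPath p = ChoosePath(caps);
    std::printf("HDR path: %d, shadow path: %d\n",
                static_cast<int>(p.hdr), static_cast<int>(p.shadows));
    return 0;
}
```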
 