Honestly, sometimes I think the Inquirer's articles are written by chimps. The grammar is so bad.
DESPITE WHAT people were claiming, the G70 will score better than the 7800
Intel_Hydralisk said: First line of the article... So I stopped reading it.
Erasmus354 said: They were referring to the rumor flying around that the G70 is named the 7800 because it scores 7800 in 3DMark05. They were saying that the G70 will score higher than the 7800 it was rumored to score. We still don't know its clock speeds, but we know the score.
defiant said: I don't know, but it just doesn't make any sense to me for NVIDIA and ATI to release new cards that effectively double the speed of their current-generation cards. They have spent millions on R&D and production costs on the 6800 and X800 cards, and those haven't even been around for a year yet.
Aren't they effectively cannibalizing sales of those products by releasing cards that are twice as fast? For example, when NVIDIA and ATI released the GF4 and 9700, all subsequent cards were pretty much incremental speed bumps up until the X800 and 6800, which effectively doubled the performance of the previous generation.
Now you have people saying that they are going to double performance again? While this will make the X800 and 6800 cheaper, sales would be substantially lower than if an incremental speed bump were released. Why, you may ask? Price/performance ratios.
It is possible that NVIDIA recouped their R&D because SLI really did take off more than a lot of people thought it would. There are a lot of people out there that bought 2 instead of 1. That helped a lot!

Un4given said: The X8xx series cards are still based on the R300. While mildly modified over time, there have been no major alterations; it is still the same core at heart. ATi has already recovered their R&D for this core.
I'm not sure about NV. ATi released the R300 core when NV was releasing the FX, then the 6xxx series some time after that, so the 6xxx cards have not been out nearly as long as the R300 core. For that reason I couldn't say whether NV has fully recouped their R&D costs for the 6xxx series.
trudude said: It is possible that NVIDIA recouped their R&D because SLI really did take off more than a lot of people thought it would. There are a lot of people out there that bought 2 instead of 1. That helped a lot!
Well we can be pretty sure it's gonna kick HL2 in the arse, and probably Far Cry as well... my major question is if ATi will improve their OpenGL performance.

|0b0 said: I'd be jumping for joy right now... if I gave a damn about a benchmark. What can it do in Far Cry @ 1600x1200 6xAA 16xAF... that's the info I'm looking for. =)
jebo_4jc said: Well we can be pretty sure it's gonna kick HL2 in the arse, and probably Far Cry as well... my major question is if ATi will improve their OpenGL performance.
That is true that not everyone went out and bought 2 6800 Ultras, but NVIDIA probably makes just as much profit when people buy 2 6600GTs. I know at least a dozen of my close friends who bought dual 6600GTs. I also know 3 others besides myself that bought 6800GTs. There are a lot more people out there than you think that went SLI with NVIDIA. Most of my friends are not even hardcore gamers.

FanATIc said: Define "a lot."
You are simply not making any sense. High-end cards are ALWAYS the minority in total production. This is true for just about anything. You make it sound like everyone in the neighborhood is running SLI. A pair of SLI cards costs anywhere from $400-$1000 at retail prices. Not only is that high end, it is not appealing. In fact, I haven't seen anything saying SLI was causing NVIDIA's cards to sell especially well; what I have been seeing is that the nForce 4 SLI boards are selling well. That says nothing, since it's basically the best and was for a time the only chipset combining AMD and PCI Express. Hell, I own one because it was my only choice! I won't be going SLI.
Performance has been improving... ATi is pretty much tied with nV in high-end D3.

jebo_4jc said: Well we can be pretty sure it's gonna kick HL2 in the arse, and probably Far Cry as well... my major question is if ATi will improve their OpenGL performance.
They run at equal settings... is that not equal? Yes, the nV card will run at a higher FPS, but at any higher settings it becomes unplayable.

{NG}Fidel said: I would not say tied, just a bit below nowadays.
Even still, both cards (the 6800U and X850XT PE) are good enough to run DOOM 3 at insane settings.
Dude... those DNA drivers are 3rd-party modded drivers. They are like 10 fps faster than the latest NVIDIA beta. 3rd-party drivers don't count.

trinibwoy said:
jebo_4jc said: Dude... those DNA drivers are 3rd-party modded drivers. They are like 10 fps faster than the latest NVIDIA beta. 3rd-party drivers don't count.
Doh... I just looked at the latest NVIDIA driver and saw that it was slow.

trinibwoy said: Huh - did you not see the 76.44 WHQL score? The DNA drivers are based on that driver.
jebo_4jc said: Doh... I just looked at the latest NVIDIA driver and saw that it was slow.
Edit: Wait, why are all the subsequent drivers much slower? Was the 76.44 score a bug?
Shadow27 said: Honestly, sometimes I think the Inquirer's articles are written by chimps. The grammar is so bad.
Cool if true, though.