NVIDIA Interview: The Sky Isn’t Falling

HardOCP News

I'm not sure why people feel the need to explain that it's "no longer possible" for consoles to have better graphics than a PC. Honestly, when has that ever been the case?

Tamasi: It’s no longer possible for a console to be a better or more capable graphics platform than the PC. I’ll tell you why. In the past, certainly with the first PlayStation and PS2, in that era there weren’t really good graphics on the PC. Around the time of the PS2 is when 3D really started coming to the PC, but before that time 3D was the domain of Silicon Graphics and other 3D workstations. Sony, Sega or Nintendo could invest in bringing 3D graphics to a consumer platform. In fact, the PS2 was faster than a PC.
 
Also, when was the PS2 ever "faster than a PC"? If I remember correctly, in 2000 when the PS2 launched it had a ~300MHz processor, while PCs were running 1GHz Pentium IIIs and Athlons with Voodoo3s or TNT2s.
 
Well, the debate is still raging as to whether or not you're going to need a terabyte of video memory...
 
Also, when was the PS2 ever "faster than a PC"? If I remember correctly, in 2000 when the PS2 launched it had a ~300MHz processor, while PCs were running 1GHz Pentium IIIs and Athlons with Voodoo3s or TNT2s.

which looked FAR better than a PS2
 
heh, makes me wonder if his thoughts are biased due to the fact that all consoles are now powered by AMD
 
I'm thinking that back when the Atari 2600 came out, it may have had better graphics than the PCs that existed at the time.
 
heh, makes me wonder if his thoughts are biased due to the fact that all consoles are now powered by AMD

Maximum PC just did a hypothetical battle in this month's issue ... Sony has the best console graphics performance at 1.8 teraflops ... even a mid-range PC can crank out upwards of 4 teraflops (in either an AMD or Nvidia configuration) ... high-end PCs can do even better ... if you only want to consider unit cost (not life or software costs) then consoles can win the battle with PCs ... if you look at total cost vs performance (including life and software) then PCs will win hands down (regardless of whether you use AMD or Nvidia) ;)
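To make that unit-cost vs. total-cost argument concrete, here's a rough back-of-the-envelope sketch. Every price in it is a made-up placeholder for illustration (only the 1.8 vs. 4 teraflop figures come from the post above), so treat it as the shape of the argument, not real numbers:

```python
# Rough cost-vs-performance sketch. All prices below are hypothetical
# placeholders; only the teraflop figures echo the post above.
console_price, console_tflops = 400.0, 1.8      # assumed launch console price
pc_price, pc_tflops = 900.0, 4.0                # assumed mid-range gaming PC
games_over_lifetime = 30                        # assumed library size
console_game, pc_game = 60.0, 40.0              # assumed average game prices

# Unit cost per teraflop: the console looks competitive here.
print(console_price / console_tflops)           # ~222 $/TFLOP
print(pc_price / pc_tflops)                     # ~225 $/TFLOP

# Total cost over the platform's life, including software, per teraflop:
console_total = console_price + games_over_lifetime * console_game
pc_total = pc_price + games_over_lifetime * pc_game
print(console_total / console_tflops)           # ~1222 $/TFLOP
print(pc_total / pc_tflops)                     # ~525 $/TFLOP
```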
 
Until 3dfx came on the scene, I don't think one could claim that PCs had better graphical performance than a console, unless you want to argue 2D graphics, possibly?
 
PS2 was launched in March of 2000 according to ol' wikipedia.

I was running a P3-800MHz OC'ed to 866 (iirc) with a Creative Labs flavor of the original GeForce 256 (T&L BITCHES!) cranking some 150fps in Counter-Strike Beta 6 or 7.

A PS2 was not f'n faster than my PC. Ever.
 
People need to stop arguing about what PCs are capable of doing, because that's not the point here.

If we're talking 720p or possibly even 1080p gaming, some of the console games are going to have features and graphics that not everyone on PC has. Not everyone on PC has 8 GB of memory to work with, not everyone on PC has a modern feature-set GPU, not everyone on PC has hardware physics, and not everyone on PC has an 8-core CPU. Everyone with a next-gen console does have those things.

Of course the PC overall has more capability and more power, but when it comes to the lowest common denominator, at least for now, next-gen consoles are going to do some amazing things with graphics that many lower-end PC systems cannot do.
 
Also, when was the PS2 ever "faster than a PC"? If I remember correctly, in 2000 when the PS2 launched it had a ~300MHz processor, while PCs were running 1GHz Pentium IIIs and Athlons with Voodoo3s or TNT2s.

IIRC the PS2 could do about 6 GFLOPS, and its vector units were faster than anything else from PC or console at the time. Too bad it was a nightmare to use both of them.
 
Those guys must have smoked some good shit before that interview cause they're all fucked up.
 
Bet you he would not be happy to hear I am going to purchase an R9 290X then? :D Yeah, since they did not win any of the console contracts, my guess is this is damage control.
 
Also, when was the PS2 ever "faster than a PC"? If I remember correctly, in 2000 when the PS2 launched it had a ~300MHz processor, while PCs were running 1GHz Pentium IIIs and Athlons with Voodoo3s or TNT2s.
It's kind of silly to compare a RISC processor to an x86 processor by clock speed (in fact, it's kind of beyond silly), but you still make a legitimate point.
 
Wow, Nvidia seems to be going into full damage-control counter-marketing mode?!

Liked the bit about admitting to console efficiency but saying 2x, not 3x... so if Nvidia is lowballing at 2x and reality is, say, 2.5x for argument's sake, then that raises the PS4 line to an interesting level... ;)
 
IIRC the PS2 could do about 6 GFLOPS, and its vector units were faster than anything else from PC or console at the time. Too bad it was a nightmare to use both of them.

Company benchmark quotes that spew out "gflops" and "mips" and "bips" figures like this are meaningless when it comes to gauging actual game performance, and that's why they use those numbers instead of information on actual in-game frame rates, resolutions, eye-candy effects, and so on....;) This is especially "good marketing" aimed at consolers, because they're far more likely than computer-system consumers to think such numbers are significant. Benchmarks like these are nothing but marketing tools and say little about actual performance while running game software. It's a good idea to dismiss them out of hand when shopping, because actual gameplay frame-rate numbers, resolution information, and eye-candy performance numbers (FSAA, etc.) provide the kind of information that is of some value to a consumer.
 
From what I remember, consoles tend to fit into two different categories:
1: Easier to program for, and less powerful than PCs (DC, GC, Xbox, etc.)
2: Powerful and hard to program for (Saturn, PS2)

Otherwise, the only things that make console games look better are the tricks they use to do it. For example, the SNES was made to display 3 layers of graphics at a time, which helped make games look good, especially when dealing with transparency (like fog) at a lower performance impact.

Now that consoles are finally using more PC-based hardware, the comparisons are slowly beginning to be apples to apples, both in development and performance. And many people are getting hard-ons because of that.
 
conventional wisdom: history is written by the winners

apparently that is no longer the case
 
Trivia Tidbit: In reference to my Amiga post above, if editing here had been possible I would have added that I remember the time when all computer game boxes displayed only Amiga screenshots on the back--the fine print read something like this: "Screenshots pictured above are from the Commodore Amiga version of this game. All other versions may vary." Indeed, CGA (4 colors) and/or monochrome "versions" looked nothing at all like the Amiga's 32-color versions--although the Amiga had higher color capabilities even at that time, the 32-color mode was the one most used by developers, IIRC.
 
I'm pretty much knocked out due to illness, so 'scuse me if this was already answered. The "PS2 faster than a PC" thing still applies today, but it has jackshit to do with Nvidia; it's about the RDRAM the machine has. Remember Rambus? Not even DDR4 comes close to those data rates, of course. Tricky business to emulate, too: games that relied on that speed don't run right even in the official (yet secret) PS2 emulator code that Sony has for all PS3s.
 
And one great fun fact about Nvidia's saltiness over the new consoles and AMD is that Nvidia sat and waited for Sony/MS/Nintendo to come to them. They've got previous-Apple/modern-MS-like delusions of grandeur and need a change of leadership, unless they're actually doing fine with the Tegra & server business.
 
Thanks for this blast from the past. I actually remember reading this when he first posted it. My first video card was a Riva TNT. It def shat on the Voodoo3s that were popular at the time.

IIRC, the TNT's only claim to fame was 24-bit (32-bits?) D3d color support, whereas the V3 was limited to a hybrid 16/22-bit mode (3dfx called it that because internally the V3 calculated out to 32-bits while output was rounded at 16-bits.) I had a TNT installed with my V2 way back when, but when I bought a V3, I yanked both the TNT and the V2 out and replaced them with a V3. The V3 ran circles around that combination. In those days doing without 3dfx immediately classified someone as a casual gamer because ~90% of the games you could buy were GLIDE games (D3d as an API didn't catch up with Glide until DX7 or so, and the only thing a TNT supported was D3d--and almost no 24-bit D3d games even existed at that time.) Put the TNT in 24-bit mode and watch it crawl...;)

It wouldn't be until the TNT2 that things started to get more interesting in regards to competition with 3dfx, imo...;)
 
What the hell is that guy talking about? 3D wasn't big when the PS2 came out? Who writes this crap?
 
And by the time the SNES came out, PCs had pretty VGA graphics, with games available like The Legend of Kyrandia!

http://www.myabandonware.com/game/the-legend-of-kyrandia-1rk

That's very true...;) The only thing I couldn't figure out relative to VGA is why developers never supported the ~262,000-color mode but stuck with the 8-bit 256 colors instead (or maybe that was SVGA--can't recall.) C= had one more act up its sleeve, though. About that time C= introduced the AGA chipset, which was very similar to VGA (with an 8-bit 256-color mode), except that in the Amiga's HAM8 mode you could get actual 24-bit color displays, whereas VGA couldn't. But by then C= was fading as a company, and I can't recall any developer actually doing an AGA game--which was unfortunate for me at the time, since I had an A4000 AGA machine...!
 
I'm pretty much knocked out due to illness, so 'scuse me if this was already answered. The "PS2 faster than a PC" thing still applies today, but it has jackshit to do with Nvidia; it's about the RDRAM the machine has. Remember Rambus? Not even DDR4 comes close to those data rates, of course. Tricky business to emulate, too: games that relied on that speed don't run right even in the official (yet secret) PS2 emulator code that Sony has for all PS3s.

The RDRAM in the PS2 ran at 3.2GB/s, which is what single-channel DDR1 PC-3200 runs at.
What were you saying about DDR4 not running at those speeds? ;)
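For anyone who wants to sanity-check those numbers, here's a quick back-of-the-envelope sketch (the transfer rates and bus widths are the commonly cited specs for each part, so treat the results as approximate peak figures):

```python
# Peak memory bandwidth = (bus width in bytes) x (transfers per second).
# Figures below are the commonly cited specs; real-world throughput is lower.
def bandwidth_gbs(bus_width_bits, transfers_per_sec):
    return (bus_width_bits / 8) * transfers_per_sec / 1e9

# PS2 main memory: two 16-bit RDRAM channels at 800 MT/s each
ps2_rdram = 2 * bandwidth_gbs(16, 800e6)     # ~3.2 GB/s total

# Single-channel DDR1 PC-3200 (DDR-400): 64-bit bus at 400 MT/s
ddr1_pc3200 = bandwidth_gbs(64, 400e6)       # ~3.2 GB/s

# Single-channel DDR4-3200: 64-bit bus at 3200 MT/s
ddr4_3200 = bandwidth_gbs(64, 3200e6)        # ~25.6 GB/s

print(ps2_rdram, ddr1_pc3200, ddr4_3200)
```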
 
IIRC, the TNT's only claim to fame was 24-bit (32-bits?) D3d color support, whereas the V3 was limited to a hybrid 16/22-bit mode (3dfx called it that because internally the V3 calculated out to 32-bits while output was rounded at 16-bits.) I had a TNT installed with my V2 way back when, but when I bought a V3, I yanked both the TNT and the V2 out and replaced them with a V3. The V3 ran circles around that combination. In those days doing without 3dfx immediately classified someone as a casual gamer because ~90% of the games you could buy were GLIDE games (D3d as an API didn't catch up with Glide until DX7 or so, and the only thing a TNT supported was D3d--and almost no 24-bit D3d games even existed at that time.) Put the TNT in 24-bit mode and watch it crawl...;)

It wouldn't be until the TNT2 that things started to get more interesting in regards to competition with 3dfx, imo...;)

I think I might have misspoken. I remember my roommate getting a fancy new Voodoo card and I was jealous till I bought my TNT. So I think he actually had a Voodoo2 and not a Voodoo3. This was 1998, when I was a freshman in college.
 
Ok, maybe I'll give them the PS1, but the PS2 was up against the GeForce 2 Ultra, which was the GTX Titan of its time.
 
heh, makes me wonder if his thoughts are biased due to he fact that all consoles are now powered by AMD

I'd be willing to suspect that, but some of his reasons are pretty sound. Especially the power argument. The consoles have to be living-room friendly and cost-conscious. Power is a finite resource in that instance, and right now there are only two markets where 3D acceleration is evolving seriously, and that is with the $400+ desktop cards and in mobile devices. Previously, consoles could combine the benefits of a really tweaked driver while pulling parts from the midrange segment, but that segment is stepping outside the console power envelope these days. The mobile chipsets aren't good enough to even leave graphics in the same spot on a generational refresh.

There's just no real competition right now in a market segment that refreshes every year or two and that 10-year console lifespans can benefit from, except perhaps laptop-targeted chipsets and integrated graphics. Unless PC gamers drastically change their approach to gaming rigs, or that market becomes less viable, I don't see that segment advancing fast enough to matter for consoles. If mobile GPUs do, the console is dead. The only other opportunity I see is if you get the likes of MS deliberately keeping a solid revision of DirectX out of the PC market like with the original Xbox, while OpenGL thrashes around ineffectively for years on end, AND you also don't have mobile chipsets that aren't hobbled by API BS nipping at the mid-level laptop range of performance at the time.

With mobile devices failing to see huge leaps of usability and awesomeness between generations anymore, I don't expect 3D performance to be ignored in the feature wars. Especially with things like Ouya and SteamOS and MS's Chinese household set-top box with games, all the streaming-to-a-thinner-client gaming services being floated, etc. There just isn't going to be a chance for console makers to create an artificial advantage anytime soon, and a natural one that drags raw GPU performance with it doesn't appear to be looming.
 
Hmm, maybe when the SNES first came out?

Outperformed by the Sharp X68000 series of computers (Japan), which released in the late '80s.

At no point were consoles ever more powerful than home PCs*.

*In Japan.
 
My PC from the PS2 era certainly blew it away, BUT my PC was certainly not a mainstream PC either.
That was the era when good 3D cards weren't in most PCs. I'd take a PS2's 3D over almost anything off the shelf in Best Buy, Circuit City, etc. from that era.
 
I'm pretty much knocked out due to illness, so 'scuse me if this was already answered. The "PS2 faster than a PC" thing still applies today, but it has jackshit to do with Nvidia; it's about the RDRAM the machine has. Remember Rambus? Not even DDR4 comes close to those data rates, of course. Tricky business to emulate, too: games that relied on that speed don't run right even in the official (yet secret) PS2 emulator code that Sony has for all PS3s.
What are you smoking? RDRAM was running out of performance advantages by the time the PS2 came out. It only held a slight performance advantage for a couple of years when SDRAM was popular around 1998-2000. Once DDR came out it lost all advantages. DDR SDRAM was finalized in June 2000, only a few months after the PS2 was released.

Early Pentium 4s used Rambus RAM. It also was not that fast. DDR266 was almost as fast, and DDR333/DDR400 surpassed it in 2001-2002, hence its discontinuation by Intel in all future chipsets. RDRAM also had higher latency and put out a lot more heat.

The only thing I can think of that you could be referring to is XDR RAM, which is also a Rambus technology... but it wasn't around when the PS2 came out.
 