BravO))) -- Supreme [H]ardness | Joined: Jun 16, 2008 | Messages: 6,635
Yeah, I figured they either had to be n00bs or trolls. I just felt like ranting.
BTW, I'm a dudette.
Pics or shens.
There are few of us here, but we sure like to get heard.
Sorry dudette! Good to finally see some representation from the opposite sex!
This really was their best strategy: scalability and market placement. Still, there was a point there; they had most of the items needed for DX11 and the easy-scale design. It was pretty slick. Now why can't AMD take this from ATI and move it to the CPU division?
I kind of miss the K7 days, when AMD managed to completely catch Intel by surprise. The K8 was a nice evolutionary step from there, but after that things just kind of... didn't do much other than going back to AMD being in the shadow of Intel. Really a shame; I wish Intel got some real competition from AMD instead of the latter merely nibbling away at the mainstream and budget markets.
Neither a noob nor a troll. I just don't respect anything at all coming out of Nintendo lately, especially not a 2001-ish era polished turd with fancy gimmick controls. Hollywood and Broadway are nothing more than clock increased, process shrunk Gekko and Flipper, respectively -- which is fine considering Nintendo's market. The market that the Wii caters to is just fine with having to rebuy SNES games on their GBAs, N64 games on their DS and through VC for ridiculous prices.
I stand by my assertion that Nintendo's R&D was minimal in comparison with the other console manufacturers, especially given:
1) the fact that the console was sold at a profit from launch
2) 100% backwards compatibility with previous generation hardware AND leaked press reports indicate an evolution rather than a new design
Less R&D than the X360 and PS3? Probably. But "little to no R&D expenditure was required" is quite far from the truth if you had read the interviews with the actual people involved in the R&D process. There is the entire brainstorming process, mockups, piles of concepts and ideas which get half worked out only to be scrapped, until eventually a more concrete idea surfaces. It's not like Nintendo started with 'Let's make a supercharged GameCube!' and slapped some PCBs and cases together.
Hindsight makes things look too bloody obvious at times, but I can assure you that at the beginning of an R&D cycle it is anything but.
I'm surprised ATech isn't jumping in on this like a rotten diaper on a baby.
Here's what will happen (my guess): They will have an actual card at CES. No one will be allowed to touch it. No one will be able to see it run benchmarks. In other words, it will be a highly detuned Fermi beta engineering sample until they get the last of the kinks in hardware/software out. Wouldn't be the first time a hardware vendor did this.
He's prob in his green pajamas right now dreaming about JHH... he'll be here soon. And lol @ that blind fan early in this thread talking about a double x2 version Fermi card... sorry, this round it's going to be impossible due to power restrictions.
While I'd tend to agree with you on this, I can't help but recall when many people were saying this about the GTX 200 series as well, and they were obviously able to pull that off.
Can just imagine Jen-hsun in a ninja suit sabotaging TSMC machinery.
Looks like Pharaoh beat me to it. I recall several articles claiming the GTX 295 would be impossible to make.
Care to fill us in on your insider info on Fermi power consumption?
It was impossible, or at least not reasonably feasible, until the die shrink many months later, and even then it was a pair of 275s and not 285s. Not saying they can't do it without another die shrink, but this chip is supposed to be significantly larger than the 200 series. We will need to see what sort of wattage it will actually have to dissipate before we can really make a guess at whether or not it's really doable. I don't know about you guys, but I never want to see triple-slot coolers, or water cooling as a requirement.
No insider, just speculating. If it has more transistors than a 5870 X2, which is rumored to be near 300W, then it's logical to assume it won't make the PCI-E power restrictions.
OK, taking devil's advocate for Fermi, I can see them taking the shrink down to 32nm to make this happen, just like they shrunk the GT200 process to make the GTX 295. With TSMC's current problems on 40nm... I wouldn't bet on it.
Of course, feel free to hope, and by the time it comes out I'm sure AMD's single-GPU 6 series will come around a week later to beat it by at least 10-15%.
I thought TSMC was just going to skip over 32, and shoot straight for 28?
Would be cool if nVidia targets this with Fermi, or ATI does this with the 5000-series refresh (for the top-end cards; the lower ends don't count as much).
Umm... that's simply not true. The power doesn't pass through the PCI-E slot, so it doesn't matter. There is nothing stopping Nvidia or ATI from putting ten 8-pin power connectors on a card.
What DOES matter is heat. It is harder (not impossible) to put out a card with a higher TDP.
Last I read, AMD was going for 32nm for their 6 series with GlobalFoundries, who is targeting that process for them, which is why TSMC promised 28nm to counter. http://xtreview.com/addcomment-id-10092-view-Globalfoundries-32nm-reported-for-later-period.html All reports so far say neither TSMC nor GlobalFoundries will be ready for anything less than 40nm until 2010. Hey, that's probably when they will release the GT395! Sweet.
That is sweet.
Now imagine the competition that will stem from this!
that is for CPU, not GPU I believe...
About 75W can, but most cards now draw less than that from the PCIe slot, just in case of poor motherboard power distribution / multi-GPU madness.
iirc, it's 75W for a 6-pin and 150W for an 8-pin (the GPU connectors, not the mobo connector).
Occasionally, I like to throw a grenade into a conversation, then run away. This is one of those times.
http://74.125.113.132/search?q=cache:XvRHZt7d-3cJ:www.brightsideofnews.com/news/2009/10/30/opinion-are-all-amd-fans---idiots.aspx+%22are+all+amd+fans+idiots%22&cd=1&hl=en&ct=clnk&gl=us&client=firefox-a
Yes, they can draw 75 watts through the PCI-E slot. I was referring to them not being limited by the PCI-E for power draw; most of the power comes from the 6/8-pin connectors used. And tbh, I'm not sure how much they are currently drawing through the PCI-E slot. I know they can draw some, because if you boot without attaching the other connectors you will get a video output telling you to connect them.
It could be that 3.0 offers 300W (this is just me speculating).
Max... like in everything added up?
The slot only gives 75W, iirc. Unless 3.0 changed this?
It could be that 3.0 offers 300W (this is just me speculating).
PCIe 1.1 = 75W max
PCIe 2.0 = 150W max
PCIe 3.0 = 300W max now?
Edit - Yep, just checked, as part of the specification for 3.0, it should provide 225/300W of power from just the slot itself.
Check under specifications. Last time I checked the 295 isn't a PCIe 3.0 card, so I guess it can't exist.
Nvidia.com_GTX295 said: Maximum Graphics Card Power (W): 289 W
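For what it's worth, here's a quick sanity check on the connector math everyone keeps quoting. The wattage figures are the commonly cited slot/PEG connector limits, so treat them as assumptions, not spec quotes:

```python
# Commonly cited PCI-E power budget figures (assumptions for this calc):
SLOT_W = 75        # PCIe x16 slot itself
SIX_PIN_W = 75     # 6-pin PEG aux connector
EIGHT_PIN_W = 150  # 8-pin PEG aux connector

def board_budget(six_pins, eight_pins):
    """Total board power available from the slot plus aux connectors."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

# GTX 295 ships with one 6-pin and one 8-pin: 75 + 75 + 150 = 300 W,
# which is how its 289 W rating squeaks in under the limit.
print(board_budget(1, 1))  # 300
```

So the 295 exists without needing PCIe 3.0; it just maxes out the slot-plus-connectors budget.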
Repeat of the current GTX 295 - 5870 situation.
Q3 call said:
Jen-Hsun Huang
So let me see if I can break that down a bit -- I expect GeForce to grow next year, not only because the market will be healthier next year. There is real evidence that GPU adoption is increasing. And the enthusiasm behind Fermi, our next generation GPU architecture, is just out of this world. I mean, it's just way over the top. And the reason for that is because this is the first brand new architecture we have created in four years and, instead of an incremental change to DX11, this is a fundamentally new architecture and the performance is fabulous. And so we are expecting to be very successful with Fermi and all of its derivatives.
Glen Yeung - Citigroup
Jen-Hsun, I just need you to clarify one thing, if you might -- when you say the performance of Fermi is great, are you saying that relative to your previous architecture or also relative to the current competition?
Jen-Hsun Huang
Both.
Glen Yeung - Citigroup
Great, thanks.
David Wu - Global Crown Capital
Okay, that’s good. Jen-Hsun, I was at the show when Fermi was previewed and you showed very good numbers on the computing side. Relative to your competition that is shipping products, you didn’t talk at all about the graphics side -- should I assume that the graphics performance is equally superior to the competition that is shipping right now?
Jen-Hsun Huang
We didn’t announce anything on graphics because it wasn’t graphics day. When we announce GeForce and Quadro, we are going to talk about the revolutionary graphics ideas that are designed into Fermi, and we are looking forward to doing that in the near future. And so please be patient with us -- the market is anxiously waiting and we have enthusiasts all over the world who are waiting for us to ship it. You know, every four years or so, we revolutionize the GPU with a brand new architecture. If you remember, the G80, which became the GeForce 8800, was probably one of the most successful products in the history of our company, and we’ve been doing incremental changes to G80 since then.
And now with Fermi, it’s another revolutionary architecture and I’m expecting it to take the GPU market up another notch.
Fermi mentions in today's Q3 earnings call--
http://seekingalpha.com/article/171...-earnings-call-transcript?source=yahoo&page=6
lol, every 4 years nvidia creates a great arch then renames it for another 4 years.
Jen-Hsun is the CEO.
And relative to the competition, the market has really spoken -- although it's a fast chip, it's not that fast, and it is basically an RV770 with DX11.