Too bad ATI didn't keep something back to be enabled later --> 5870, 5850

NeonGuy

Was thinking it's too bad that ATI didn't secretly keep some performance back from all the reviews, only to be enabled via a BIOS update, e.g. [200 more cores = 1800], giving us more performance and opening a can o' whoop-ass on Nvidia at the right moment.
Now, I know with future driver updates we'll probably see another 5-10% across the board, but... it would be cool.
 
seriously?!?

Given their current yields, I can almost bet that they are maxed out now.

And I'm sure nVidia is going to do the same thing. They look at ROI and try to find that sweet spot. Obviously they will try to get close to or surpass ATI. That's a given.

If ATI ups their specs later in the production cycle, then nVidia is just going to delay their product until they have something close or better.

Rest assured, I'm sure each camp has their spies planted inside the other's factories and knows exactly what the other is producing for XYZ timeframe. I mean, how many times has team green come out and said, "we're giving you features M, N, O," only for team red to turn around and say, "We've been working on the same features M, N, O for months now. Let me show you our beta of it"? (And vice versa.)
 
With some tight binning of chips, they can release a new card that runs at 1GHz+ at stock. Many 5870s are getting there with just a slight voltage adjustment, and from what I've read on several forums, the performance increase scales pretty well. And they are already working on getting 32nm chips, so they could have a new card with more shaders in development that might be released around January/February. Of course, this is all just speculation on my part, and I could be, and probably am, way off target.
 
I wonder why ATI hasn't unlocked the shader clock like Nvidia... does the architecture not allow it?
Because a 1400MHz shader core would be a huge improvement.
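For what it's worth, a quick back-of-envelope sketch of what those clocks would mean for theoretical peak shader throughput (Python; the SP count and FLOPs/clock are AMD's published HD 5870 figures, and the linear-with-clock scaling is an assumption -- real gains are smaller, since memory bandwidth doesn't scale with the core clock):

```python
# Back-of-envelope: theoretical peak single-precision throughput for an
# HD 5870 at different clocks. Assumes 1600 stream processors doing one
# multiply-add (2 FLOPs) per clock, and assumes throughput scales
# linearly with clock -- real-world gains are smaller, since memory
# bandwidth and other bottlenecks don't scale with the core clock.

SP_COUNT = 1600         # HD 5870 stream processors
FLOPS_PER_CLOCK = 2     # one MAD per SP per clock

def peak_gflops(clock_mhz: float) -> float:
    """Theoretical peak GFLOPS at a given core/shader clock in MHz."""
    return SP_COUNT * FLOPS_PER_CLOCK * clock_mhz / 1000

for mhz, label in [(850, "stock"), (1000, "binned/overclocked"),
                   (1400, "hypothetical shader clock")]:
    print(f"{mhz:4d} MHz ({label}): {peak_gflops(mhz):.0f} GFLOPS")

# 850 MHz  -> 2720 GFLOPS (matches AMD's quoted 2.72 TFLOPS)
# 1000 MHz -> 3200 GFLOPS (~18% over stock)
# 1400 MHz -> 4480 GFLOPS
```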
 
And they are already working on getting 32nm chips, so they could have a new card with more shaders in development that might be released around January/February.

This would be very atypical for ATI. ATI has done two things during the spring refreshes:

1) Release a faster-clocked version on the existing process
2) Release an RV (Radeon Value) version to test the new, smaller process

The engineers learn and refine the smaller process until it's perfected. That way it's easier to produce larger quantities of chips on the new process in the future.

In other words:
1) Release process X
2) Refine process X
3) Tinker with smaller process Y on a simple chip
4) Refine smaller process Y for better yields
5) Release a more complex next gen on process Y
 
If ATI's architecture did allow for that and they had unlocked it, Nvidia's cards would have to be much faster than they currently are; if that were in fact the case, the HD 4000 series would have been competing with GF300.

The shaders being clocked separately from the core in an Nvidia card is the obvious key to having lower core clocks and fewer stream processors while keeping a performance advantage. For ATI to adopt this methodology via a change in architecture would be a bit on the costly end, whereas their current one is much more extensible. Nvidia is going strictly for the amount of bang, where ATI is bang vs. buck (not just for consumers, but inside its product development as well). Why change something you can scale very easily through multiple generations that competes performance-wise?
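To put rough numbers on that point (a sketch using the public spec-sheet figures; theoretical peak only, and note that Nvidia's 3 FLOPs/clock counts a co-issued MUL that is rarely sustained in practice):

```python
# Rough illustration of decoupled vs. unified shader clocks, using
# public spec-sheet figures. Theoretical peak single precision only --
# real performance depends on how well each architecture keeps its
# units busy (Nvidia's scalar units are easier to fill than ATI's
# five-wide VLIW bundles, which is why the gap in games is far
# narrower than these peak numbers suggest).

def peak_gflops(shaders: int, shader_clock_mhz: int, flops_per_clock: int) -> float:
    return shaders * flops_per_clock * shader_clock_mhz / 1000

# GTX 285: 240 SPs in a separate shader domain at 1476 MHz,
# counted at 3 FLOPs/clock (MAD plus co-issued MUL).
print(f"GTX 285: {peak_gflops(240, 1476, 3):.0f} GFLOPS")   # ~1063

# HD 5870: 1600 SPs tied to the unified 850 MHz core clock,
# 2 FLOPs/clock (MAD).
print(f"HD 5870: {peak_gflops(1600, 850, 2):.0f} GFLOPS")   # 2720
```

In other words, the decoupled shader domain is how Nvidia wrings competitive throughput out of far fewer processors at a lower core clock.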

I really hope they re-enable ATI rendering with Nvidia PhysX in Win7. My copy will be here on Oct 22 and I really can't wait.
 
This would be very atypical for ATI. ATI has done two things during the spring refreshes:

1) Release a faster-clocked version on the existing process
2) Release an RV (Radeon Value) version to test the new, smaller process

The engineers learn and refine the smaller process until it's perfected. That way it's easier to produce larger quantities of chips on the new process in the future.

In other words:
1) Release process X
2) Refine process X
3) Tinker with smaller process Y on a simple chip
4) Refine smaller process Y for better yields
5) Release a more complex next gen on process Y

I do completely realize that this is how AMD goes about things. But I want to see AMD become more aggressive by utilizing their relationship with GlobalFoundries and taking the Intel approach when it comes to pushing manufacturing processes. I still think the "5890", or whatever they call it, will just be a highly binned chip with higher core/memory speeds and voltage. I don't even think they will have to rework the chip's power delivery like they did with the 4890.
 
Why would you say that?

Why would you say that it's "too bad" that ATI is giving all of their chips' performance to their customers up front from day 1?

Are you so invested in the red vs. green battle that you'd prefer to have some small amount of the performance of the chips that are out now kept from you until some arbitrary moment when ATI can say "booyah!"?

Personally, the big reason I like ATI right now is that I haven't seen them behave like that. On the surface at least, they seem a lot more ingenuous in their business practices of late than nVidia has been.
 
Why would you say that?

Why would you say that it's "too bad" that ATI is giving all of their chips' performance to their customers up front from day 1?

Are you so invested in the red vs. green battle that you'd prefer to have some small amount of the performance of the chips that are out now kept from you until some arbitrary moment when ATI can say "booyah!"?

Personally, the big reason I like ATI right now is that I haven't seen them behave like that. On the surface at least, they seem a lot more ingenuous in their business practices of late than nVidia has been.

They know they need to behave and build up a customer base. They don't have the market share to piss off all the potential customers. Plus, I think Nvidia has trademarked the "graphics company with an oversized head stuck up its overconfident ass" marketing strategy.

P.S. -- The last part was a complete joke. Call it a satire of an AMD fanboi.
 
The problem with that idea is that, as of right now, a constant for the industry is that GPU prices inevitably fall, so ATI has to scramble to wring as much money as they can from early adopters. If they held back on performance, they couldn't convince as many early adopters to buy their new cards, and they'd miss out on any extra profit they might receive from a new release.

TL;DR: This idea is dumb.
 
Um, first of all, the 5870 blows the GTX 285 out of the water; that alone convinces many to upgrade to ATI, in my opinion. Then two months later they recommend a certain driver, or a flash to a new BIOS, and whoops, they've found another 10% increase in performance on top of the driver performance increases, creating another 5xxx frenzy! Keep the frenzy going, yo. I mean, really, we have two companies trying to convince us their GPUs are better than the other's. But do they just want my money, or do they want to be my friend? There are better ways to get to the top than stepping all over the little guy. Think outside the box! I think AMD did, by releasing CPUs that can be unlocked to quads. The way to people's hearts is $$$, and ATI isn't extorting money like Nvidia; the GTX 280 was like $600 when it first came out, wasn't it? ATI/AMD doesn't have to have the fastest card; they can dominate the affordable segment to get to the top.
 
I have to admit it would be a good laugh if they did: Nvidia releases something faster, then ATI provides an update that matches its performance at a much better price. It would sure suck for Nvidia, but I don't think ATI has this planned. ;)
 
We could have changed the thread title to "Too bad Nvidia didn't have a response ready for the 5800 series until 2010", and the conclusion would be the same: somebody's looking to TROLL.
Most normal people would want 100% performance out of the box, not a marketing 'gimmick' where ATI says, "Look, we had more performance to unlock; we were just waiting to use it to trump Nvidia!"
That makes no sense.
 
I'm actually upset that this card doesn't double as a tap. I mean, just think about it: a video card that serves you ice-cold beer... mmmm
 