ATI Radeon HD 2600 XT @ [H]

Even in the absence of competition, Intel has reason to keep processor pricing in check to a degree. If they make chips too expensive, people won't buy them, or at best they'll only buy the lowest-end, cheapest ones.

Erm, they could sell the lowest-end, cheapest CPU at whatever price they want, since people would have no other choice but to buy it anyway. As long as AMD is still around, even if their CPUs aren't competitive, people will still buy them if the price stays low. The HD 2600 XT isn't competitive, but as long as it's cheap the card will still have a market. As soon as AMD/ATI is gone, prices will start climbing and development will slow until a new company comes in and provides some competition, even in the low-end segment.
 

I agree. Development will slow and prices will climb. My point is that Intel might keep prices somewhat reasonable in order to sell more chips. If they raise prices too high, people will sit on their systems and CPUs longer than they normally would. Intel doesn't want that; they want you to buy a new CPU every couple of years at least.
 
The R520/R580 was completely different, even down to the memory controller; it wasn't relying on the old R300 technology at all, and it kicked some serious ass. So calling out ATI over one subpar-performing product as an indication of their future really doesn't have much ground to stand on.

My mistake on the R5x0. Thanks for pointing that out. Though I still don't change my assessment of the future.

ATI really messed up with R6x0. They need a significant redesign that won't come anytime soon. Meanwhile NV is reportedly going to roll G9x with a huge performance increase this winter, so it looks like NV will again own 2008 like they owned 2007.

Also consider that on paper the R600 should be taking no prisoners, with its bazillion shader units and 512-bit bus for massive memory bandwidth. Yet it falls down. Where does it go from here?
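The "512-bit bus for massive memory bandwidth" point is easy to put numbers on. A rough sketch in Python, using the commonly published specs for the 512-bit R600 flagship (HD 2900 XT) and the 8800 GTX; treat the exact memory clocks as approximate:

```python
# Peak memory bandwidth = bytes per transfer x transfers per second.
# Specs below are the commonly published figures, not measured values.
def peak_bandwidth_gbs(bus_width_bits, effective_mtps):
    """Peak bandwidth in GB/s from bus width (bits) and effective transfer rate (MT/s)."""
    return (bus_width_bits / 8) * effective_mtps / 1000.0

r600 = peak_bandwidth_gbs(512, 1650)  # HD 2900 XT: 512-bit, ~825 MHz GDDR3 (1650 MT/s)
g80  = peak_bandwidth_gbs(384, 1800)  # 8800 GTX: 384-bit, ~900 MHz GDDR3 (1800 MT/s)
print(f"R600 ~{r600:.1f} GB/s vs G80 ~{g80:.1f} GB/s")
```

So the R600 really does have a solid bandwidth edge on paper (roughly 105 GB/s vs 86 GB/s), which makes its real-world showing all the stranger.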

I don't think that AMD, as a company with a three-way focus (CPU/GPU/Fusion), can be as sharply focused on graphics as NVIDIA, who already has a lead and will be putting everything into maintaining it.

Though I hope ATI gets back in the game. I don't think the lack of competition is affecting release frequency from NV, but I think it is keeping prices higher.
 
Yea, it'll keep prices higher and speed increases lower (remember moving from a GF2 to a GF3?).

The R600 definitely is a mystery. It has a lot of features, some good for future development, as well as a beefy architecture. The leakage it has most likely isn't the reason for the failing performance; on paper it should sit just below a GTX, given the way they're using vec5 instead of scalar units (throwing more, less-efficient shader processing against fewer, more-efficient units should roughly equal out performance, right?), but something isn't working.
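The vec5-versus-scalar trade described above can be sketched with back-of-the-envelope arithmetic. The ALU counts and clocks below are the commonly cited specs for R600 (HD 2900 XT) and G80 (8800 GTX); the utilization figures are purely illustrative assumptions, since real packing efficiency depends on the shader code and compiler:

```python
# Rough shader throughput in MAD issues/sec; utilization models how well the
# compiler fills the VLIW5 slots (scalar designs stay near fully occupied).
def madps(alus, clock_mhz, utilization):
    return alus * clock_mhz * 1e6 * utilization

g80       = madps(128, 1350, 1.0)  # 128 scalar ALUs at 1350 MHz shader clock
r600_best = madps(320, 742, 1.0)   # 64 vec5 units = 320 ALUs, perfect packing
r600_typ  = madps(320, 742, 0.6)   # assumed: compiler fills only ~3 of 5 slots
```

With perfect packing the R600 comes out ahead of G80, but at the assumed ~60% slot utilization it drops below it, which is one plausible way "more but less efficient" fails to equal out in practice.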

It's been reported, but not confirmed, that the AA is run through the shaders, which uh... doesn't make sense at this stage in the game. ATI isn't dumb, either, so my guess would be that something broke and ATI had no time whatsoever to fix it, so they implemented a workaround.

But they definitely won't go under; their 2400 and 2600 have plenty of good selling points and should sell well alongside NV's offerings, which should bring in some money.

Either way, the R600 isn't a company-ending product. It isn't great, but it's not horrible either; they're just bleeding money on that one product right now.
 
The biggest design flaw, from a personal perspective, is the poor AA performance. That's the feature I most want to turn on; I hate jaggies and shimmer. I will shut off fancy lighting and shadows and turn down textures to be able to turn up AA. Weak AA guarantees I will go for the competition.

It almost seems like they came up with theoretical models for how they could squeeze AA performance out of the shaders, and the calculations were broken, so the performance never materialized (and never will) in the real world.

I see this as hurting because AMD never got to claim the new-technology premium, that initial high price as you roll out new technology, on this series. The 512-bit interface makes the cards more expensive to build, yet they really can't charge a premium. So, smaller profit margins.

At the lower end, the 2600 XT really only competes with the 8600 GT, so they can only get that kind of pricing.

AMD is also giving away Athlon X2s at fire sale prices. They are being squeezed from both sides (CPU/GPU).

I don't think AMD will croak anytime soon, but if the next GPU/CPU iteration again falls to also-ran status and brings smaller margins, there will be trouble. You can't squeeze by on low-margin, underperforming parts on both sides of the business for long.
 
I agree again, what exactly are we debating about? hehe

AMD rode the A64 boat way too long; that one's on them. ATI made a bad decision with the R600, again something they can fix, just not on current hardware.

How they get out of it is up to them, but these mistakes, if management is smart enough to see them, will be fixed next generation, and AMD should have parts out that will compete.

They went through a pretty massive CEO and management shakeup recently, so hopefully those pitfalls are gone.
 
No, it's pretty clear that something is broken or not working on the R600, mainly the AA engine. Seeing as it does worse at AA than ATI's prior GPU, I would definitely say it broke and ATI ran out of time to fix it.
 

QFT. AMD had to release something, and a broken R600 it is. That's the reason there's no XTX version. If the silicon had been good, the XTX would have "owned" the GTX/Ultra and the XT would have "owned" the GTS. But alas, when it rains, it pours.

No, I have no "proof" of this per se, but looking at the way things turned out, it's logical...

And one more thing: Barcelona/Phenom is having the same problem, along with 65nm leakage problems...


Ply
 
Barcelona isn't really a leakage problem; it's more that their clocks can't get past a certain level without breaking the CPU. They need to hit 2400 MHz on air without any problems; otherwise their CPU won't be a contender and will just be a replacement for the X2 at the same price (not a good position to be in).
 
Good thing this is the ATI section, I would hate it if there was another place where all the posts about CPUs could go.
 
It kind of turned into a "things AMD is having trouble with" thread. Back on topic:

The R600 was a serious stepping stone and wake-up call for ATI. It was a huge step beyond every direction they'd taken before, and they just planned it poorly. If ATI/AMD want to continue broadening their GPUs (especially in transistor count), they'll have to take it more seriously instead of designing an all-out awesome GPU and then fucking up on performance because their attention or intentions were elsewhere.
 
I've been a long time [H] reader. I expect straight shooter reviews devoid of bull. And this review was, except for this at the bottom of the part for Oblivion:

"Paying close attention to the cobblestone texture on the path in front of the camera, you can easily see that the texture filtering is sharper using 8x AF on the XFX GeForce 8600 GT XXX than using 4x AF on the ATI Radeon HD 2600 XT."

EASILY? The difference is so minute that anyone actually playing the game would never notice it. Come on, [H]. I normally expect you to gloss over this crap.
 

In gameplay the difference may not be that great, but it is easily seen in the screenshot provided.
 
My new card in packaging. (Check the bottom for the price I paid and a link to get one.)
l_a11478a307d15c7dadad34a885d7fede.jpg

Size of packaging.
l_e53e78f2a33f0632212f18caf4944444.jpg

Compared to my back-up card "Radeon 9200"
l_b60d46697cbe6665251bd12fb9055cc2.jpg

Back of the card (Heat Sink for ram)
l_5e0cd77ff8ebe1af7e8bc6eec6625f82.jpg

Make sure there is proper ventilation.
l_8b8c0384e7795041fda95e28bb87ec8b.jpg

Image of the back of my computer. (I suggest an Ageia PhysX card for single-GPU people.)
l_77067729449d89df59f9f26db125eec5.jpg

Games for Ageia PhysX (click to play the corresponding movies of each game):
Switchball, Monster Madness, Cell Factor, Crazy Machines 2, Rail Simulator
I purchased my card from BLT.com (click here for the product sales page) for $175.77. (NOTE: the drivers are still under development and will need attention.) It runs fine, though.
And for all other cards I got the best prices from PriceGrabber.com
Keywords: images of HD 2600 XT, images of VisionTek HD 2600 XT, HD 2600 XT AGP, BLT Item #: BN18221, Manufacturer: VisionTek, Mfg. Part #: 900184, UPC Code: 784090022455
Thanks a lot for looking!!!
 
http://overclockers.com/tips01182/

65nm is killing AMD. They are getting better, but it has not been an easy road. This compounds Barcelona's other problems...


Ply

How does AMD's own 65nm process have anything to do with TSMC's? They don't produce their own GPU cores (yet), and if anything, history tells us that although AMD is slower to get to grips with a new process than the other guys, they sure as hell can squeeze much more out of it, with much better results. Besides, that was around a year ago, when they started producing on 65nm, and it isn't the case anymore. How else would they be able to run a 3GHz Phenom X4 with the standard heatpipe HSF? (Some would argue it's "cherry-picked," but that's not the point, and it undercuts your assumption that all 65nm parts are hot, leaky bastards.)

Do your homework before you spew forth ignorance.
 