R600 Date

So, even with the little we know about the R600, we can conclude it is certainly something to take into serious consideration when purchasing your next upgrade, whether your strategy is to wait for the R600 or to do something more like what I did and let the R600 put the heat on the G80, for your benefit.

I definitely think that either way the R600 is going to affect prices. Even if it is only as fast as the 8800 GTX (I can't believe I just said *that* - only as fast as a card that in some cases outperforms two of the previous generation cards!), at least there will be a few choices on the market. Competition, even if it doesn't directly lower prices, does encourage manufacturers to sweeten their deals, too.

I just wish I didn't have such a crappy card while I'm waiting, but that's my own fault for being such a tightwad. My laptop plays everything well, so the only "pressing emergency" pushing me to upgrade is the desire for something better in my desktop. Put in perspective, there is no real emergency forcing me to play games better, but I sure want to!

The last information I heard was that they will be showing at CeBIT (March 11-13), but no performance numbers will be published at that time. Cards will be out in April. Source
I'm not sure how reliable their information is, but lately they have been the only site with any information on anything.
 
I'm talking about performance in today's titles and titles six months or so down the road (including Crysis). I ballparked the 15% figure by taking into account the rumours we've seen so far: fairly low shader clock speeds (no indication of a decoupled shader clock), 64 Vec3 ALUs, ~2.0 GHz GDDR4, and a 512-bit bus.

On paper, the bandwidth is great. You want enough bandwidth for the subsystems not to get choked constantly, but going completely overboard tends not to be terribly advantageous. I think the 8800 GTX has bandwidth to satisfy, though we'd see some gains if it had a true 512-bit bus. From a shader-ops perspective, which is what really counts these days, R600 doesn't seem terribly efficient (G80's scheduler can theoretically feed all ALUs full-time, all the time) or terribly fast (based on the initial rumours). I expect it will surpass G80 in raw shading performance, but only slightly.
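For a rough sense of the numbers being thrown around, here is a back-of-the-envelope sketch of peak memory bandwidth. The 8800 GTX side uses its published specs; the R600 side uses the rumoured 512-bit bus and ~2.0 GHz effective GDDR4 from this thread, so treat it as a guess, not a fact:

```python
# Back-of-the-envelope peak memory bandwidth.
# G80 GTX figures are published specs; the R600 figures are the rumoured
# 512-bit bus and ~2.0 GHz effective GDDR4 mentioned above (speculation).

def bandwidth_gbs(bus_width_bits: int, effective_clock_ghz: float) -> float:
    """Peak bandwidth in GB/s: bytes transferred per cycle times the effective clock."""
    return (bus_width_bits / 8) * effective_clock_ghz

g80_gtx = bandwidth_gbs(384, 1.8)   # 384-bit bus, 900 MHz GDDR3 (1.8 GHz effective) -> 86.4 GB/s
r600    = bandwidth_gbs(512, 2.0)   # rumoured 512-bit bus, ~2.0 GHz effective GDDR4 -> 128.0 GB/s

print(f"G80 GTX: {g80_gtx:.1f} GB/s, R600 (rumoured): {r600:.1f} GB/s, "
      f"ratio: {r600 / g80_gtx:.2f}x")
```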

15% seems reasonable to me, and the performance should be fairly consistent. That would give AMD the crown for a while, but the 8900 GTX may be able to take it back very narrowly, and possibly do so for a lower price.


Thanks for clearing that up; I can see where you are coming from. In the past ATI has had some pretty balanced designs, and I can't see them adding all that bandwidth and not being able to use it. Not that I put blind trust in ATI's engineers. It just seems like a big waste of time and money to develop a 512-bit bus and slap on fast GDDR4 RAM, only to be able to use some of it due to other bottlenecks. I know ATI is keeping lots of details about the shader power mum, so I guess we will have to wait and see. Strange things have happened before :)


The bandwidth advantage only comes from the VRAM clocks and the extra 25% of bus width (and because of the burst-rate increase, the higher clocks won't translate into exactly the same gain as the raw clock bump would suggest).

So I don't expect more than a 20% increase overall in most recent games, because the bottlenecks aren't fully bandwidth-bound, if AMD is using 1.0 GHz GDDR4 RAM. Games like Oblivion and FEAR are more fillrate-bound than bandwidth-bound; COD2 and BF2 will show something closer to the bandwidth percentage increase, though.
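One hedged way to see why the overall gain could land well under the raw bandwidth increase: only the bandwidth-bound slice of a frame speeds up with a wider bus, Amdahl's-law style. The fractions below are made up purely for illustration, and `frame_speedup` is just a helper written for this sketch:

```python
# Amdahl's-law-style estimate: only the bandwidth-bound part of frame time
# benefits from extra memory bandwidth. Fractions are illustrative guesses.

def frame_speedup(bw_bound_fraction: float, bw_gain: float) -> float:
    """Overall speedup if only bw_bound_fraction of the frame scales with bandwidth."""
    new_time = (1 - bw_bound_fraction) + bw_bound_fraction / (1 + bw_gain)
    return 1 / new_time

# ~48% more bandwidth (rumoured 512-bit @ 2.0 GHz effective vs. the GTX's 86.4 GB/s):
print(frame_speedup(0.4, 0.48))   # fillrate/shader-heavy game (Oblivion, FEAR) -> ~1.15x
print(frame_speedup(0.8, 0.48))   # more bandwidth-bound game (COD2, BF2)       -> ~1.35x
```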


Razor,
OK, I see where you're at. I guess I have to go over that B3D thread, since you brought up fill rate too, and see how this all plays out.
 
It's very simple: the other games that I play don't need an 8800 GTX.
CS 1.6? Lineage 2 and a media player don't really benefit from an 8800 GTX. When more games come out that I will play and that need more power, then I'll buy a new card again, but I'd rather have the money in my pocket till then. Plus I'm sure the 8800 and the R600 will be cheaper by then =)

If that wasn't clear enough, here's the short version: there was one game I WANTED to play and my card couldn't push it, so I bought the GTX, finished that game and sold the GTX 0.0

So, basically you bought an 8800 GTX to play one game... OK.
So now, when a new game comes out, one that you really want to play but your X1950 Pro can barely run, you'll buy the next big thing in 3D graphics, finish the game and sell the card.
I'm sorry, but that makes no sense to me. An 8800 GTX would last you much, much longer and you would be able to run any game in the near future with max or next-to-max in-game settings, simply because the first-gen DX10 games are using the 8800s in their development phase. This is a fact.

In the end, it's your call of course. It's your money and you do what you want with it. I just didn't see much sense in what you did :)
 

Why would I keep the 8800 GTX in my system when it's not being used for anything? I'd rather have the money in my pocket than a paperweight in my PC. If anything, when the R600 launches the 8800 GTX will drop in price and I'll have saved $100-200. It would still be a paperweight until some games I might play come out. I jumped the gun and got the 8800 GTX, and then I realized it wasn't worth keeping for one game :p Right now there just aren't enough games that "I" would play to make keeping an 8800 GTX worthwhile.

ALSO:
you and everyone else have no idea how DX10 games will play on an 8800.
 

Buying a card like an 8800 GTX is an investment in the future, which means, as I've said in my previous posts, it will last you longer, much longer, than any card you have now.

As for DX10 games, it is a fact that the 8800 cards are being used in the development phase of many games (Crysis is a good example), and you can be sure they will run great on a single 8800. A good comparison is Half-Life 2 in ATI's 9x00 days, mainly the 9700 Pro, which was the first DX9 card on the market.
 


How is the 8800 GTX an investment? I'll be losing money by keeping the card and NOT USING it for the next few months. It's just not useful enough for me to keep, man; it's fairly simple, I don't know why you need to overcomplicate it. And don't ever call a top-of-the-line video card an investment. Investments are usually things that are
a) useful to you
or
b) make you money
which the 8800 GTX is not and does not do, not for me and not for anyone else. It's simply a convenience, a toy, and nothing else.
If you still don't get it, here's the short version:

Buying a $700 video card that you won't use until the same card is selling for $500 is a waste of money. So why would I buy it now and not later, when I might actually use it? Is there going to be a shortage of 8800 cards in 1-2 months that I don't know about?
If I were someone who played games more often, especially more demanding games (CoH, COD2, FEAR, Oblivion, etc.), then it would be worth spending that money now, but I don't, so it's not.
 

As I said, in the end it's your call, not mine :)
I just thought it was strange and commented on it. But since we're so off topic, let's leave it at that.
 

deal =p

While I had it, though, I had to try some games on it and it was nice ^^ Let's see if the R600 can make that niceness a bit more affordable.
 

That's also something I don't understand. The R600 (the high-end models) will be as expensive as or more expensive than the MSRP of an 8800 GTX.
And according to recent news, it's NVIDIA that will launch the low-to-mid range models of the G80 first, before any R600 model.
 



The price of the 8800 will probably be lower by then =p and you will have other models like the 8600 Ultra, which are very nicely priced. Also, I read that the R600 will launch with lower-end models as well. More options = more competition = better prices for people who are buying.

"AMD remains secretive about their R600 but we still managed to dig out a little more information. We can expect to see R600, RV610 and RV630 demos @ CeBIT but no performance tests to be revealed there. The R600 launch is slated for end of March it appears now and there are 3 SKUs; XTX, XT and XL. ATi briefly mentioned to their partners a 2nd generation Unified Shader Structure and a new hardware Crossfire for R600 family which can be bridgeless yet faster. We are not sure how does this new HW CF work out yet. The retail card is half an inch longer at 9.5" and the OEM/SI version remains the same at 12". It comes with 512-bit memory bus and 1GB DDR4 at more than 2GHz frequency. Core clock remains a mystery as yet." thats from another thread which quoted dailytech i think
 
Buying a card like an 8800 GTX is an investment in the future, which means, as I've said in my previous posts, it will last you longer, much longer, than any card you have now.
Famous last words for CIOs/CTOs: "Computer hardware is an investment in the future. If we buy this mainframe now, we won't have to upgrade for decades"...

Just like someone in the CPU forums mentioned the other day:
"Nice computer you have there!"
Thanks, I built it myself - let me show you the insides... That's a high end dual core E6600 cpu there, and it's seriously overclocked.
"Wow, cool. What's that accessory there? I've never seen that before"
That? Oh - that. That's for futureproofing. It's a giant wad of cash.

If you want a G80 now, that is fine. You can use it and play all the games that you want with it now, and it's surely the right choice for you. The other poster does not see the need for a G80 now. As such, not owning a G80 is the right choice for him. I fail to understand how you know his needs better than he does.
 
Famous last words for CIOs/CTOs: "Computer hardware is an investment in the future. If we buy this mainframe now, we won't have to upgrade for decades"...

I didn't say that and you know it. I said that an 8800 GTX is an investment in the future, in the sense that there's nothing better out there and it will last much, much longer than anything else.
 
UPDATE: It seems the tests from level505 were just confirmed. Link: http://www.ddworld.cz/graficke-karty/atir600-alias-ati-radeon-x2800xtx-testy-potvrzeny.html
It's in Czech. BTW, I know this guy and his info is very reliable most of the time. According to this info, the tests were done by someone from Microsoft with a WEAKER version of the X2800XTX on very early drivers which were modified by the tester.

*cough* my Czech is a bit rusty, but all the charts I see are from level505. What does he say in his text?
 
If you want a G80 now, that is fine. You can use it and play all the games that you want with it now, and it's surely the right choice for you. The other poster does not see the need for a G80 now. As such, not owning a G80 is the right choice for him. I fail to understand how you know his needs better than he does.

And I fail to understand what you don't get about my "I thought it was strange and made my comment about it". This is a forum, you know. People exchange their ideas/opinions with each other.

Anyway, too much OT, back to R600 Date.
 
*cough* my Czech is a bit rusty, but all the charts I see are from level505. What does he say in his text?

also, the handle that posted that only has two posts to his credit...


anyway, I think everyone has decided that the level505 "benchmarks" are bogus.
 
Yeah, I also thought the level505 test was fake, but this guy usually has very reliable info. He said that the source of this info is DFI.
 
If the information is so reliable, why are you running around to every forum, creating an account, and posting links to it? Especially when it's just a combination of crap from all the fake sites piled together, with absolutely nothing new added?
 
I posted it on just two forums. That's not like posting it to every forum I know :) :) :)
 
This SHIT is never gonna be released...

Yes it will. For it to be released within any of the time frames that have been indicated, this card must already be in production.

I read somewhere that AMD, now being at the helm, was not going to allow ATi to do another paper launch, and that there was to be product available in quantity when announced. It may not be on store shelves the day it's announced, but you won't have just a few vendors with only a half-dozen cards at launch.
 


http://www.vr-zone.com/?i=4572

ATi has finally provided some updates to their eagerly awaiting partners regarding the schedule of their upcoming DX10 cards. R600 will be launched at CeBIT, but card availability is in April, and we heard that the official name could be Radeon X2900.

:eek:
 
The G80 and the R600 are two great cores, to be sure. We know that the G80 is almost ready for SM4.0 and the unified shader (NV dubbed: Stream Processor {SP}) driver, and as NV has stated many times, the G80 can do physics calculations thanks to its twelve-stage SIMD design; it can do almost anything, so long as the 12 stages can accomplish it. The R600 is, right now, quite a mystery card, with some massive numbers confirmed. A 512-bit memory bus (ATI dubbed: Ring Bus)? Over 150 GB/s of memory bandwidth? Are you insane? But the number 64 keeps coming up, and it would appear the R600 has 64 unified shaders. Their clock and arrangement are still not known to me (or you guys; if you do know something please post it, but I try to keep up to date on this sort of stuff). So who's better? Is the R600 worth waiting for?

Ring bus refers to their internal bus, the bidirectional 256-bit paths. It also has a 512-bit external bus as well as the internal one (a key feature to have both). Memory is way up in the air; I doubt they'd release it with 1.0 GHz GDDR4 (2.0 GHz effective), rather up it to 1.2, or 1.25/1.3. If they want to get some serious advantages out of GDDR4 they'd better start using better stuff than what was tested over a year ago.

The shaders are still pretty, uhh... nonexistent. We have no idea how they plan to go about their unified system. They've already proved it with Xenos, so we know they can do it, just not how they are going to achieve it: a single core clock, or an independent shader clock and core clock? How many ALUs are they going to be pushing per line? How are they going to split load balancing? Also, if it's going to be massive in any form, how are they going to handle scheduling? Not to mention physics seems to be becoming a huge commodity in the mass market right now; how are they going to feed us that, just in case we need/want it? So many questions, no answers (which is fine, it's not released yet!).

But I thought ATI worked closely with MS on DX10? More closely than nVidia. If anything, wouldn't ATI work better with Vista? That's just what I heard, I dunno. Something to do with the X360.

This is false and it's getting irritating hearing it all over the place.

ATI was contracted by MS for a SKU of a GPU they could use in the Xbox 360. MS told ATI what they wanted it for and what they wanted it to do, ATI made it and sold it to MS, and that's the extent of their dealings with the Xbox 360.

Microsoft is working with ALL companies to ensure maximum compatibility. If it doesn't fit their tight standards then it won't be a D3D10-compatible component. They have nothing to gain by only working with one company; it's up to each company to make a product that meets those standards, not Microsoft's job to make it work for them.
 
Just to clarify a few things: being able to do physics calculations isn't really something that needs to be added in. Cards have been capable of that for some time. For instance, all X1K-series ATI cards can perform them with fairly good efficiency, and the 8800 series can do the same thing. The 7900 series was typically left out because its dynamic branching performance was fairly abysmal.

Also, the ring bus isn't really a trademark so much as a description of what it's doing. They essentially form rings around the entire chip with a bus going in each direction. R580 has a 256-bit external memory bus with a 2x256-bit (512-bit internal) bus; R600 should be 512-bit external with 2x512-bit internal.
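As a toy illustration of why running the ring in both directions helps (this is only a sketch of the general idea, with a made-up 4-stop layout, not ATI's actual R580/R600 topology): traffic between any two ring stops can take whichever direction is shorter, which roughly halves the worst-case hop count compared with a one-way ring.

```python
# Toy model of a bidirectional ring bus with 4 stops; purely illustrative.

RING_STOPS = 4

def hops(src: int, dst: int, stops: int = RING_STOPS) -> int:
    """Minimum hops between two stops when traffic can travel either way around the ring."""
    clockwise = (dst - src) % stops
    counter_clockwise = (src - dst) % stops
    return min(clockwise, counter_clockwise)

worst_bidirectional = max(hops(0, d) for d in range(RING_STOPS))   # 2 hops
worst_unidirectional = RING_STOPS - 1                              # 3 hops on a one-way ring
print(worst_bidirectional, worst_unidirectional)
```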

Also, Samsung apparently stopped selling 1 GHz GDDR4. That doesn't mean ATI won't downclock to those speeds, but they will likely use at least 1.1 GHz chips, if not up to 1.4 GHz.

In regard to the AMD (ATI) and MS relationship, DX10 was an openly developed spec, but it does resemble how the Xbox 360 was set up, so ATI potentially has some existing experience relevant to DX10. Also, seeing what worked and what didn't with Xenos could have influenced the direction MS went with DX10. So I'd imagine it's entirely possible DX10 was pushed in a direction favorable to ATI: figure out what worked well with the Xbox/Xenos and improve it for DX10.
 

Card physics isn't new; I was wondering how the card would handle it, whether it could number-crunch and render at the same time. I believe that is what developers want, not a new card added in for dedicated physics. Who wants to purchase another pricey card just to do physics, much like the PhysX add-on? If people want it, or think they need it, the better solution would be an all-in-one addition.

The point of the ring bus isn't to have a bus surround the chip but rather to move data in both directions simultaneously; this removes a lot of bottlenecks and makes for a more efficient framework. However, it does use more juice, though nothing to complain about. As far as what the R600 will have, no one knows what the internals will be. The bidirectional 256-bit paths seemed to work rather well, removing the internal bottlenecks we have now; who knows if ATI wanted to spend more transistors to move to 2x512-bit internally (the 2x256-bit would work just fine with the 512-bit external bus).

ATI won't downclock. For one, they've never been known to downclock, not to mention they are trying to make up for lost ground. It's not like their sales on current cards are bad; they just want the top-dog spot again, much like they're used to from the past three series. 1.4 GHz isn't too far of a reach for them, but it would be overkill and would push their retail prices higher than they will want. Samsung has GDDR4 rated way higher, but it comes at a price, which is why I speculated 1.2-1.3 (this could also mean ATI has a stockpile of 1.0 GHz GDDR4 in house and wants to use that; 1 GB is nothing to sneeze at, that's a lot of modules).

The D3D10 spec was completed a long time ago. They did release a 10.1 spec, but that doesn't have anything new over D3D10.0; it's just stricter guidelines. The dealings with ATI weren't for DX10 compatibility (I don't think the OS on the Xbox 360 has the ability to run any D3D10 spec) but more along the lines of making something better than what was out. ATI having the first real chance to do something involving features required in DX10 definitely gave them a leg up, but that doesn't mean Microsoft gave them any more special attention than Nvidia. Whatever they decided to push or leave out with regard to the 360 would, in my opinion, be a waste there, seeing as that GPU isn't used that way but rather as a really good SM3.0 chip.
 
I know the X1K series is capable of physics at the same time as rendering graphics. I'd imagine G80 is the same, but I don't have any experience with it. Performance-wise a second card would always be better, and taking an old card and using it for physics would seem to be an ideal method for expanded physics capability. That's not to say loading up on a bunch of high-end cards isn't possible.

While the ring bus doesn't necessarily need to circle the core, there's no reason not to, since the start and end of it are connected. I also believe there were some pictures showing the bus clearly circling the chip with 4 stops in the loop; that's also how ATI explained it. And for a 512-bit external bus, 1024-bit internal would almost be a requirement to avoid saturating it. Plus, if you're fetching 512 bits, ideally you could keep forwarding them at the same pace.

Also, 1.1 GHz, 1.2 GHz, and 1.4 GHz GDDR4 chips are the only ones Samsung has listed, so I'm going off of those as the possible options. And I have seen X1950s with 1.1 GHz chips before, and I'm fairly sure they weren't running 1.1 GHz memory speeds, so they have downclocked chips before. I'd say 1.2 GHz is the most likely target. It really depends on just how much more bandwidth the chip needs.

As for making up lost ground, so far they don't appear to have lost too much. AMD gained market share while Nvidia lost market share last quarter, so they aren't necessarily hurting that much. Just going off the bandwidth numbers, we could be seeing a fairly large jump in performance, so the ground might already be made up.
 
From my understanding of the R520/580 series of GPUs, you would still need a separate card, one not currently in CF or rendering, to be dedicated to physics. It's been a while since I've read anything on it.

Heh, yeah, I knew it circled the chip, just saying it doesn't have to. Regardless, considering chip layout you would think having everything symmetrical would help out with the design. Also, for some reason I'd like to see 4x256 paths instead of 2x512; that leaves extra headroom to play with but also involves developing a much more complicated core. I'd like to see memory pathways used for dedicated data processing, for some reason that would be awesome.

RAM sold and RAM tested are different. I meant what ATI would sell them at, not what Samsung is selling, much like how the 1.1ns GDDR3 could reach 1.0 GHz in a few cases but was classified at 900 MHz, and even then a lot of that memory ended up failing even at that spec. Not really sure where I'm going with this; working a mid-shift on 3 hours of sleep sucks, especially when trying to have a good conversation.

By lost ground I meant in sales, not performance. The X1950 is doing really well right now, but their next gen is still a ways off, while Nvidia is sitting happy right now, being able to market the 8800 as the first DX10 card released. It also seems they are going to beat them with the mid-to-low range too.

edit: after looking at Samsung's page, it looks like you can still order the 1.1 GHz (or the 090) parts. Did they announce they stopped producing it?
 
I think they announced they will stop producing 1.0GHz.

For the ATI cards, they just process commands as they come in, so the cards really don't care whether the commands say "do graphics" or "do physics". It's all the same to the card. There aren't actually any physics APIs out there to make it try, but examples using CTM seem to work from what I've seen.
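To illustrate the "it's all the same to the card" point: a physics step is just wide, regular array math of the kind shaders already chew through. The sketch below is plain NumPy on the CPU, not CTM or any real GPU API; it only shows the shape of the work.

```python
# A physics update written as pure data-parallel array math. On a GPU the same
# pattern would map to shader/stream work; this NumPy version is only illustrative.
import numpy as np

def euler_step(pos, vel, dt):
    """Advance N particles one step under gravity using simple Euler integration."""
    gravity = np.array([0.0, -9.81, 0.0])
    vel = vel + gravity * dt       # multiply-add over every particle, like shader ALU work
    pos = pos + vel * dt
    return pos, vel

pos = np.zeros((1024, 3))                  # 1024 particle positions
vel = np.random.randn(1024, 3)             # random initial velocities
pos, vel = euler_step(pos, vel, 1.0 / 60)  # one 60 Hz frame's worth of "physics"
```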
 
By lost ground I meant in sales, not performance. The X1950 is doing really well right now, but their next gen is still a ways off, while Nvidia is sitting happy right now, being able to market the 8800 as the first DX10 card released. It also seems they are going to beat them with the mid-to-low range too.


Do you have sales figures for the G80? Because, and this is not to bash nVidia, I think the money they're making from the 8800 GTX/GTS is actually quite small when you compare it to the sheer size of a company like nVidia. I don't think the sheer performance crown for 1-2 enthusiast cards actually helps the company significantly financially. It gets them 'forum cred' for being the e-penis card of choice, but I don't think it does much more. And I'm not just saying that about nVidia; the same goes for ATI. Even if R600 1337 hax0rz pwnz G80, it's not going to make or break ATI.

I read an article saying AMD is actually doing EXTREMELY well in terms of market penetration. You would think the opposite would be true, since Conroe took the crown. But when you consider that countries like Venezuela can only afford the cheapest and most integrated systems, you start to see the picture (and that's a market AMD wants, since they're concentrating on integrated and cheap graphics with ATI). These companies don't just sell to first-world nations. My Armani-wearing, Maserati-driving, ski-vacationing lifestyle is not representative of the majority of customers, and I realize this. Whereas we buy one card, or gawd knows maybe even two, some company is buying 100 or more FireGL/Quadro cards for their visualization Linux boxes. THAT'S where the money is... workstation and the low, low end.

I know all the nVidia zealots are gonna jump on me for what I said, because they want to believe the 8800 is really hurting ATI. But I'm not saying this to short EITHER company, just saying that this 'performance crown' means much less to ATI/nVidia than it does to us enthusiasts. So corporate performance and our own perceptions of who's 'on top' are usually quite out of sync because of this. That's all I'm saying. Okay... ready... set... flame me! :D
 

I don't need to be an NVIDIA zealot to tell you that NVIDIA is hurting ATI in the high-end market. That's where the released 8800 cards sit, and ATI has absolutely nothing to compete with them.
However, it's not the high-end market that dictates who's "winning" or "losing" where sales are concerned. That's the mid-to-low range. And according to recent news, NVIDIA should release their mid-to-low range of G80 by the time ATI "paper-launches" the R600 in March.
 

Beating ATI, but not hurting them.
Unless you have sales figures and financials to prove your statement.
 

I thought it was actually the lowest end of the market that generated the most revenue, i.e., integrated systems and such? ATI has a big share of this market, and that's what fills their coffers. I always pictured the revenue structure as a pyramid, where the low-end products make the most money at the bottom and the high-end products make the least at the top.
 

And you are right, but I was strictly talking about the NVIDIA vs. ATI competition.
 
Even if the high-end cards don't produce the most money, they project an image. Faster cards make the company appear to have a better range... having the best-performing card is a marketing thing as well as a money thing =)
 
Not to mention that pretty much every card released in the next two years will be based on that high-end card.
 