Sorry, no RTX2080 performance for $300 on Navi

I care. Less power consumption means less heat, which means a cheaper PSU and more overclocking headroom while running cooler. And saving a few pesos on the power bill.

With a CPU I might care a bit more since I overclock, but with the GPU sitting idle more often than not I couldn't care less; even my 290X didn't bother me, nor did I really notice any difference in the power bill. Maybe I just don't game as much as some of you.
 
Unless you're running multiple GPUs or playing the SFX game, power consumption of these little things is really not a big issue to me. Performance and price are. Unfortunately, high power also usually means big coolers, which are expensive.
I won't even be considering Navi unless it's priced as a 580 replacement with Vega performance.
 
Well, the HD 7970 was pretty good... until NV released the GTX 680, which by chip designation was intended to be the GTX 660.
Never heard that before, source please?

290X was also actually pretty good performance as well but definitely too hot. Should’ve been the end of GCN then (or maybe the 480, if 390 hadn’t happened)

Regardless of when GCN should’ve been retired though, it is impressive in a way how far they brute forced the design to perform.
 
Never heard that before, source please?

290X was also actually pretty good performance as well but definitely too hot. Should’ve been the end of GCN then (or maybe the 480, if 390 hadn’t happened)

Regardless of when GCN should’ve been retired though, it is impressive in a way how far they brute forced the design to perform.
GCN was an amazing design. I used my 7870 for years. It's saddening that AMD didn't create a new arch, but it's done pretty well all things considered. It's been said before, but hopefully the real effort is being put into next-gen so they pull a Zen moment in the graphics space.
IMO Navi is just to pass the time. I personally want Vega performance at 480 launch price, but if not, oh well.
 
You guys crack me up. Nowhere in history has an engineer come in and said, "Hey, this tech is working pretty darn good but not quite as good as the competition. So instead of tweaking it we should throw it in the garbage and build a new one from the ground up and just hope it works better...".
 
I'm looking to upgrade my two Maxwells to a single card that can beat them both in a best-case scenario, but the days of spending thousands on a graphics solution are behind me. I'd like to see what AMD can do for $650; so far I'm not impressed.
 
For power consumption I care more about the heat being pumped into the room, as I don't have any AC.

Even if you do, it can still suck. Some of the power/performance deltas on AMD GPUs versus the Nvidia ones have been atrocious, and I'd happily pay another US$50-US$100 to not dump another 50 W or 100 W into the room.
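For a rough sense of scale, here's the electricity side of that trade-off as a back-of-the-envelope calculation (all figures hypothetical: an extra 100 W under load, 20 hours of gaming a week, $0.13/kWh):

```latex
% Hypothetical figures, not from this thread:
% extra draw 100 W, 20 h/week of load, $0.13/kWh
E \approx 0.1\,\mathrm{kW} \times 20\,\mathrm{h/week} \times 52\,\mathrm{weeks} \approx 104\,\mathrm{kWh/year}
\qquad
104\,\mathrm{kWh} \times \$0.13/\mathrm{kWh} \approx \$13.5\ \mathrm{per\ year}
```

So the power bill alone is maybe $40 over a three-year ownership; the bigger argument is that essentially every one of those watt-hours ends up as heat in the room.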
 
You guys crack me up. Nowhere in history has an engineer come in and said, "Hey, this tech is working pretty darn good but not quite as good as the competition. So instead of tweaking it we should throw it in the garbage and build a new one from the ground up and just hope it works better...".
GCN hasn't been ideal for more than a couple of years. It's obvious today that rather than put their eggs into the HBM basket, AMD should have been working on next-gen. They've been throwing cores at an architecture that appears to have hit its most efficient number of cores long ago.
The last couple of releases have been new silicon that barely outperforms old silicon, but at a higher price.
Industry needs a 4870 moment, as a customer anyway. I'm pretty sure Nvidia is happy with the new reality of GPU pricing.
 
I care. Less power consumption means less heat, which means a cheaper PSU and more overclocking headroom while running cooler. And saving a few pesos on the power bill.
So true. I don't care much about the power bill because it probably won't budge much. But a card that's hitting 85 °C at stock is not going to have a ton of headroom. And it's a space heater sitting on or under a desk.
 
GCN hasn't been ideal for more than a couple of years. It's obvious today that rather than put their eggs into the HBM basket, AMD should have been working on next-gen. They've been throwing cores at an architecture that appears to have hit its most efficient number of cores long ago.
The last couple of releases have been new silicon that barely outperforms old silicon, but at a higher price.

It's worth pointing out that the perceived deficiencies in their architecture differ by use case.

AMD appears to have focused far more effort on compute than on gaming. Perhaps at some point they felt that they had to pick one over the other, but the general result has been the successive release of products that cannot compete across the board with Nvidia's gaming-focused line. And that's after they mostly got their dodgy drivers sorted.

But for compute? In many use cases they've been great.

Industry needs a 4870 moment, as a customer anyway. I'm pretty sure Nvidia is happy with the new reality of GPU pricing.

The big problem is that GPUs have only gotten more difficult to manufacture: big silicon on leading-edge processes with high memory bandwidth and power requirements. On top of that, multi-GPU development has gotten an order of magnitude more difficult with the advent of lower-level APIs like DX12 and Vulkan; it must be done at the game / game engine level, and so it has been largely ignored by developers these last few years. That means making smaller gaming-focused GPUs and bonding them together (the HD 4870 X2 and those that followed) isn't a legitimate high-end strategy anymore.
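To make the "engine level" point concrete, here's a minimal sketch of my own (not anything from AMD, Nvidia, or a specific engine): with Vulkan 1.1, it's the application that has to discover linked GPUs via device groups, and everything past discovery, like splitting frames or workloads across them, is on the developer.

```c
/* Minimal sketch: explicit multi-GPU discovery in Vulkan 1.1.
 * Needs the Vulkan headers and loader; compile with -lvulkan.
 * Illustration only -- a real engine must then split work across
 * the group itself, which is the part developers tend to skip. */
#include <stdio.h>
#include <vulkan/vulkan.h>

int main(void)
{
    VkApplicationInfo app = {
        .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
        .pApplicationName = "mgpu-probe",
        .apiVersion = VK_API_VERSION_1_1,  /* device groups are core in 1.1 */
    };
    VkInstanceCreateInfo ici = {
        .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
        .pApplicationInfo = &app,
    };
    VkInstance inst;
    if (vkCreateInstance(&ici, NULL, &inst) != VK_SUCCESS) {
        fprintf(stderr, "vkCreateInstance failed\n");
        return 1;
    }

    /* First call gets the count, second call fills the array. */
    uint32_t count = 0;
    vkEnumeratePhysicalDeviceGroups(inst, &count, NULL);

    VkPhysicalDeviceGroupProperties groups[8];
    if (count > 8) count = 8;
    for (uint32_t i = 0; i < count; ++i) {
        groups[i].sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
        groups[i].pNext = NULL;
    }
    vkEnumeratePhysicalDeviceGroups(inst, &count, groups);

    /* A group with more than one physical device is a linked
     * (SLI/CrossFire-style) set the application could drive explicitly. */
    for (uint32_t i = 0; i < count; ++i)
        printf("device group %u: %u GPU(s)\n", i, groups[i].physicalDeviceCount);

    vkDestroyInstance(inst, NULL);
    return 0;
}
```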

AMD has been simultaneously digging themselves out of the holes they created with Bulldozer and the 2900XT.

Big Navi will be the first gaming-focused high-end GPU under AMD stewardship, and Navi is just another step toward actually being competitive across the board for high-end gaming.
 
So true. I don't care much about the power bill because it probably won't budge much. But a card that's hitting 85 °C at stock is not going to have a ton of headroom. And it's a space heater sitting on or under a desk.

Reported temperature means... very little. That's why we look closely at TDP for the best measure of how much actual heat is being produced. Temperature is only good for comparing the performance of coolers for the same base GPU.
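As a footnote on why: at steady state, essentially every watt the card draws from the PSU ends up as heat in the room, regardless of what die temperature the sensor reports:

```latex
% At steady state, heat output ~= electrical power drawn;
% die temperature only reflects how well the cooler moves that heat away.
\dot{Q}_{\mathrm{room}} \approx P_{\mathrm{draw}}
```

So a 300 W card at 70 °C warms the room twice as fast as a 150 W card at 85 °C; the temperature readout only tells you how hard the cooler is working.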
 
As a 1080 owner, what reason was there to get a 2070 again? The meh DLSS support that you just have to "believe real hard" looks better than resolution upscaling, or the slideshow RTX effects? I forget which.

And saying the R7 sits between the 2070 and 2080 is like saying the V56 sits between the 1070 and 1080. Technically true, but it's a lot closer to the latter than the former.
Isn't it pretty standard not to buy a GPU every generation but instead skip every other one?
If you were rocking a 970, a 2070 is killer.
If you're kitted out with an RX 480, then a Vega 56 is a nice upgrade.

As far as Navi goes, if small Navi trades benchmark wins with a 2070, then it should be priced $25-50 cheaper due to the lack of DXR support. While DXR is not widespread now, it should become baked into the next round of AAA games as the engines gain support.
 
Honestly, I don't have any problem picking up a midrange or even entry level GPU for some purposes, but I won't be picking up a primary gaming GPU without hardware DXR support.

No idea when AMD is going to get around to doing that. No idea if they'll even be competitive when they do.
 
Enthusiast PSU...?!?

Let me introduce you to the HB3000, 3000W of 80+ Titanium rated power in a SFX-L form factor and connected to a three-phase main...!

What's the HB stand for you ask...? Honey Badger, because no matter what HCC CPU & quad GPUs you throw at it, Honey Badger don't care...!
 
But for compute? In many use cases they've been great.

This ^

Content creation workflows can benefit from the compute power...

Big Navi will be the first gaming-focused high-end GPU under AMD stewardship, and Navi is just another step toward actually being competitive across the board for high-end gaming.

Big Navi should also be the last of GCN; hopefully what comes after will be a Zen moment for AMD in regard to GPUs...!
 
2070s are dropping to $479-ish, and the MSI Duke has a $30 rebate bringing it to $449 if you jump through those hoops. If Navi somehow is faster than a 2070, it may very well be $499 at launch. If AMD wants to push pricing, then it has to go to like $399-449 right away. It will be interesting to see how it shakes out.

Bringing Vega 56 performance down to under $250 would be exciting for sure, but probably not happening.

Like others have mentioned: if you are already on a Vega 64, 1080, or even a 2060 or better, there's still no upgrade on the AMD side other than the Radeon VII.
 
Never heard that before, source please?
This is a rumor/guess, but many people suspected GK104 was intended to be something like a GTX 660/670, given the "4" designation and its small die size. Nvidia used to produce huge chips just to hold the performance crown, and did make GK100, which then got used only in high-end professional cards; by all means they would have used it as the GTX 680 if they had to. This was at a time when we were really getting some serious performance gains each generation, mind you.

Ultimately Nvidia tweaked the GK100 design into GK110 and used it for the GTX 780 series in the next generation.

290X was also actually pretty good performance as well but definitely too hot. Should’ve been the end of GCN then (or maybe the 480, if 390 hadn’t happened)
The 290X was a pretty nice card. At that point it was still a good decision to go with GCN.
The next-gen Fury maxed the architecture out to its very limit, and it was not a bad GPU for the time either, but IMHO the decision to go with HBM was very wrong. If, instead of the 4 GB that people complained about a lot, they had gone for something like 512-bit 8 GB GDDR5, they would have seen much better sales and recognition. As it was, people saw 4 GB, thought "this is going to limit me," and got a GTX 980 Ti, even though for all intents and purposes 4 GB was enough even for 4K for quite some time.

All these cards had worse TDP ratings than their Nvidia counterparts, but it was still within acceptable margins. What was overdoing it was concentrating efforts on reviving Fury as Vega at the point when Nvidia had Pascal. Not only does Vega 64 consume 100 W more than the 1080, but from a design standpoint, despite 12.5 billion transistors versus 8.9 billion in Fiji (the Fury X), the improvements were non-existent. It should have been obvious that this design was at its limits in Fury and should already have been replaced. And here we go again with another maxed-out GCN card... I really doubt they found any way to scale it past Fury, because if they had, they would go for more than Radeon VII performance, not less. So again the 4096-shader and 64-ROP limit, just a changed memory controller and the same L2 (increasing it last time did very little other than increase the transistor count, so why bother), and a higher clock to get to 2070 performance levels, which, given that Vega 64 is at GTX 1080 performance level, should not be hard at 7 nm. XD

Regardless of when GCN should’ve been retired though, it is impressive in a way how far they brute forced the design to perform.
One reason I can find for still bothering with GCN is the next-gen consoles, for which GCN would be preferred because of backwards compatibility. It could be done with a different GPU, especially with some clever trickery, but using GCN again makes everything pretty simple and more compatible. Not that the PC market needs Navi to be rolled out because of consoles, but it would be pretty strange if AMD had a next architecture and the consoles used an older one (like the PlayStation 3 not using a G8x derivative...), and AMD most probably is still developing its next-gen architecture. Also, the console GPUs probably took a lot of engineering effort to develop, and with consoles being a high-volume market it is understandable that AMD would care to do it right.
 
GCN isn't a deal breaker; it just has a lot of baggage to conform to. A new ground-up arch would allow more freedom.
 
This is a rumor/guess, but many people suspected GK104 was intended to be something like a GTX 660/670, given the "4" designation and its small die size. Nvidia used to produce huge chips just to hold the performance crown, and did make GK100, which then got used only in high-end professional cards; by all means they would have used it as the GTX 680 if they had to. This was at a time when we were really getting some serious performance gains each generation, mind you.

Ultimately Nvidia tweaked the GK100 design into GK110 and used it for the GTX 780 series in the next generation.

Fun part is: they made the GTX670 and GTX680 with their 'mid-range' dies, something that they've continued to do since, and have competed well with AMD's larger compute-focused dies in gaming. And by competed I mean beaten across the board in every metric nearly every generation.

I went from an HD 6950 CrossFire setup to a single GTX 670, which, while slower on paper, was significantly faster in use. I then added a second for SLI, which at the time ran extremely well. That was the last time I ran a higher-end AMD GPU; from my perspective, every generation since has been more or less the same in terms of performance, power usage, noise, etc. +/- US$50 differences are not an issue when you get into higher-end cards, especially when you're keeping them for a few years.

GCN isn't a deal breaker; it just has a lot of baggage to conform to. A new ground-up arch would allow more freedom.

It really isn't. Had AMD put out a GPU with the version of GCN used on Polaris, but with two or three times the shader processors and accompanying scaling of componentry and memory bandwidth, they'd have been competing with Nvidia's top end.

My best guess is that AMD's lack of higher-end gaming-focused GPUs has been a business choice more than a technical one. I fully believe that they could do it if they chose to.
 
Fun part is: they made the GTX670 and GTX680 with their 'mid-range' dies, something that they've continued to do since, and have competed well with AMD's larger compute-focused dies in gaming. And by competed I mean beaten across the board in every metric nearly every generation.
Not quite. For example, the Fury X was pretty competitive with the GTX 980 Ti, which was a high-end part:
[Relative performance chart at 3840×2160]

It was still the worse card, especially at lower resolutions (mostly a sign of the better job done by Nvidia's driver team) and with higher power consumption (not by much, though), but nothing to complain about. The 4 GB of memory was its biggest flaw, even if that was a marketing thing for the most part, as games ran on the card just fine until it naturally got obsolete (as far as high-end gaming goes, not so much in a general sense).

The Radeon 290X was comparable with the first Titan and not much slower than an overclocked 780 Ti, both of which were high-end chips:
[Relative performance chart at 2560×1440]

It was a similar story with power consumption, and again nothing tragic.

They should, however, have dropped GCN in the generation after Kepler/Fiji, because Pascal destroyed Vega 64, which itself was a Fury redo and nothing more, and in that generation they were competing with the "mid-range" GTX 1080 with way too big a power consumption difference.

I went from an HD 6950 CrossFire setup to a single GTX 670, which, while slower on paper, was significantly faster in use. I then added a second for SLI, which at the time ran extremely well. That was the last time I ran a higher-end AMD GPU; from my perspective, every generation since has been more or less the same in terms of performance, power usage, noise, etc. +/- US$50 differences are not an issue when you get into higher-end cards, especially when you're keeping them for a few years.
The HD 40x0 cards were pretty good, and the HD 50x0 cards especially were totally awesome. The HD 60x0 series seemed like a step backwards, with questionable design choices like moving from VLIW5 to VLIW4 and some picture quality degradation hacks (AF looked worse), but they were still decent cards compared to Fermi and later.
The HD 30x0 cards were also good for the price, though at the time the only cards really worth getting were the 8800/9800 series. I had an HD 3850 512 MB which I quickly replaced with an 8800 GS 384 MB because the latter was cheaper and simply the better card, with zero issues.

It really isn't. Had AMD put out a GPU with the version of GCN used on Polaris, but with two or three times the shader processors and accompanying scaling of componentry and memory bandwidth, they'd have been competing with Nvidia's top end.
My best guess is that AMD's lack of higher-end gaming-focused GPUs has been a business choice more than a technical one. I fully believe that they could do it if they chose to.
If this GPU was designed for consoles, it makes a lot of sense that it would be just that...
 
...not really. Also, while I don't consider bare images authoritative, even they contradict your point. Consider the R9 290X in Uber mode: we're talking massive power draw, heat output, and noise increases. AMD's definition of 'quiet' in terms of GPUs really just isn't.

Need to consider release dates as well, and need to provide links. No single site or run of benchmarks is 'the answer'.
 
It wouldn't benefit them anyway. The people in these forums and elsewhere who went ahead and bought the RTX line are to blame for all this. You gave them this inch. Now we're at a new price level for graphics cards.
 
It wouldn't benefit them anyway. The people in these forums and elsewhere who went ahead and bought the RTX line are to blame for all this. You gave them this inch. Now we're at a new price level for graphics cards.

They bought Vegas and Radeon VIIs too. AMD and Nvidia (and Intel and...) are free to price as they wish.
 
They bought Vegas and Radeon VIIs too. AMD and Nvidia (and Intel and...) are free to price as they wish.

Everything starts with Nvidia's pricing and the fanboys, or whatever it is, that will buy it regardless. I could afford to buy two 2080 Tis, as I need them for 4K 3D, and not give a fuck, but I don't, out of principle. The pricing level that's been set is ridiculous and consumers enable it year in, year out. You have people spending close to 4,000 bucks on SLI 2080 Tis; what kind of message are you sending to them? Personally I hope they crash and burn. My favorite tech hobby is getting too greedy.
 
Everything starts with Nvidia's pricing and the fanboys, or whatever it is, that will buy it regardless. I could afford to buy two 2080 Tis, as I need them for 4K 3D, and not give a fuck, but I don't, out of principle. The pricing level that's been set is ridiculous and consumers enable it year in, year out. You have people spending close to 4,000 bucks on SLI 2080 Tis; what kind of message are you sending to them? Personally I hope they crash and burn. My favorite tech hobby is getting too greedy.

As much as I personally agree with the value/cost being out of whack FOR ME, I think it is either disingenuous or naive to deride people buying 2080 Tis as fanboys or even foolish. They obviously want the best, and have the money. When you make a top-end part, you get to ride that nonlinear cost/perf curve. <insert high end car analogy here> That's the reward for a halo product. That's why all companies want that catbird seat.

Of course it is greedy. It's a corporation; that's the point: make as much money as they can. If they are selling the 2080 Ti in acceptable volume, then by definition it is priced correctly, whether or not I like it.
 
I only see one on Newegg for $999. They are still normally priced between $1,200 and $1,400.
Conversely, I also only see one for $2,000, the EVGA Kingpin.
Between Amazon and Newegg there have been many $999 models available; some sales even hit as low as $900, such as the Newegg-via-eBay sale over the holidays. These cards perform within a couple of percent OC to OC or stock to stock... The Kingpin card is for LN2 benchmarking pro enthusiasts, not gaming.
 
Everything starts with Nvidia's pricing and the fanboys, or whatever it is, that will buy it regardless. I could afford to buy two 2080 Tis, as I need them for 4K 3D, and not give a fuck, but I don't, out of principle. The pricing level that's been set is ridiculous and consumers enable it year in, year out. You have people spending close to 4,000 bucks on SLI 2080 Tis; what kind of message are you sending to them? Personally I hope they crash and burn. My favorite tech hobby is getting too greedy.

If you're doing something out of principle here, then you are the self-described 'fan boy'.

I want the best product for the job, and yes, that qualification does reach beyond just raw performance, and yes, that's meant that I've mostly bought Nvidia in the last decade, and no, that doesn't make me an 'Nvidia fan boy'. Do note the AMD GPU I'm using in my main system.
 
2070s are dropping to $479-ish, and the MSI Duke has a $30 rebate bringing it to $449 if you jump through those hoops. If Navi somehow is faster than a 2070, it may very well be $499 at launch. If AMD wants to push pricing, then it has to go to like $399-449 right away. It will be interesting to see how it shakes out.

Bringing Vega 56 performance down to under $250 would be exciting for sure, but probably not happening.

Like others have mentioned: if you are already on a Vega 64, 1080, or even a 2060 or better, there's still no upgrade on the AMD side other than the Radeon VII.

Prices are only dropping so they can slot in the 2070 Ti and 2060 Ti when Navi releases, which will only get a memory refresh on the same GPU.

Between Amazon and Newegg there have been many $999 models available; some sales even hit as low as $900, such as the Newegg-via-eBay sale over the holidays. These cards perform within a couple of percent OC to OC or stock to stock... The Kingpin card is for LN2 benchmarking pro enthusiasts, not gaming.

I'd be surprised if the AIBs continue to sell those reference-clock cards, since Nvidia is no longer going to sell those "non-binned" GPUs... they'll only be selling the A1 chips that come with the FE.
 
Prices are only dropping so they can slot in the 2070 Ti and 2060 Ti when Navi releases, which will only get a memory refresh on the same GPU.



I'd be surprised if the AIBs continue to sell those reference-clock cards, since Nvidia is no longer going to sell those "non-binned" GPUs... they'll only be selling the A1 chips that come with the FE.

Source?
 
After what I said about power consumption, I won an auction for an Asus Strix Vega 56 lol. £200 was too good to pass up. Likely half the cost of the Navi Pro or whatever it will be called.
 
Never heard that before, source please?

290X was also actually pretty good performance as well but definitely too hot. Should’ve been the end of GCN then (or maybe the 480, if 390 hadn’t happened)

Regardless of when GCN should’ve been retired though, it is impressive in a way how far they brute forced the design to perform.

Look at the code name for the GTX 680, the die size, and the memory bus.
This should be common knowledge, but bias makes people deny it...
 
Look at the code name for the GTX 680, the die size, and the memory bus.
This should be common knowledge, but bias makes people deny it...

So conjecture, got ya.

Note, I’m fine with the conjecture and can even agree with it. But it’s still conjecture and should be treated as such.
 