Navi Rumors

I'm not being funny here, but have you missed a word or two out of the bolded part? Because if you meant "the best value card" then I could agree with you, but if you meant "the best performing card" then that's obviously wrong when we're talking about the 2080 Ti or the previous-generation top-end cards.

The problem here is you're assuming what "best" means. The best card(s) are different for each person. Best can imply value, the highest framerate in a given game, the most memory... I understand what Pieter3dnow is saying. With Nvidia you do pay for the best-marketed card, that's a fact you can't argue against - their marketing has instilled GeForce into the brains of millions of people, even when their products plain sucked (I don't know how old you are, but you might want to check their FX series and the 400 series). Yet people still bought them like hotcakes, though in those situations AMD's obviously way superior products did regain a bunch of ground. Even then, Nvidia has been dominant for a long time, despite some bad generations of products.

People say AMD is now so weak, that they're not performing, but aren't they? Does performing mean winning in the $700+ market? Because last time I checked, the great majority of GPUs out there are in the $150 to $300 territory. The RX480 was better value than the 1060 (I should know, I exchanged the first for the second, only because thanks to the mining explosion I bought my RX480 for $200 and sold it for $350, then bought a cheap 1060 to keep gaming while making a nice profit... I can assure you the 480 performed way better than my 1060, but hey, I gained $150 for little effort). Likewise with the RX570s now, which are cheap as dirt, and yet people keep buying the 1060 or, worse yet, the absolute trash 1050s!

So yeah, marketing definitely has a big impact, the concept of value has shifted because of it in the minds of many consumers these days, and "best" means many different things.
 
The problem here is you're assuming what "best" means. The best card(s) are different for each person.
I'm not assuming anything, that's why I'm asking the question :)

And you're completely right that "best" depends on what metrics you're using. Equally, however, doesn't that make the statement "With Nvidia you will never buy the best card" invalid without defining what best means? That's why I was asking for clarification.
 
Well, I think Nvidia are becoming more and more Apple-like, and I don't like that.
Just because you have a good product doesn't mean you also know what everyone in the market for such a product wants; I think that's very arrogant.

I am on a 1080p screen, and I want to buy a GFX card that does that really well, and down the line, if I should go 1440p or 4K, it can also do that okay if I relax the quality settings in a game.
But the latter is probably so far down the line that it would make more sense to buy a new GFX card again at that point.

So! I could already get what I need now without too much work or spending, but as I am on a new computer build I would also like the GFX card to be on a new architecture.
So I am hoping Navi will lift a little what is generally said to be mid-range for GFX cards, because then I think it will fall well in line with my most wishful thinking about a new GFX card, which granted might be overkill, but I have never been known for under-doing anything.

I just want "the best" my money can buy.
 
[Image: leaked AMD Navi Radeon RX graphics card PCB]


https://wccftech.com/amd-navi-gpu-radeon-rx-graphics-card-pcb-leak-gddr6-vram/
 
Many words on a page saying absolutely nothing, classic wccftech :).
Expect to hear more about the upcoming Navi GPUs at Computex 2019 on 27th of May where CEO Lisa Su will be presenting the opening keynote and announcing their next-gen products.

This sentence is funny ;) because how can you not hear more when you heard absolutely nothing from this article? :)
 
I basically ignored WCCFTech's post. I'm always baffled at how much screen space they can occupy without saying anything. And once they spend that amount of text appearing like they know something, they regurgitate the past 12 months of rumors for the umpteenth time, making it seem like a very, very long article that you must read! I got tired of them long ago.

This guy on YouTube does a pretty decent and entertaining analysis/speculation of the Navi PCB, though (I learned a bunch of things about GPU PCBs that I didn't know before!):

 
God, I hope they don't go with a blower version. Please AMD, just no. If this card has two 8-pins, then that would really be insane. I'd be more than happy with a heatsink the likes of the Radeon VII's. I really want to upgrade my son's 480 with Navi, but not with a blower card.
 
God, I hope they don't go with a blower version. Please AMD, just no. If this card has two 8-pins, then that would really be insane. I'd be more than happy with a heatsink the likes of the Radeon VII's. I really want to upgrade my son's 480 with Navi, but not with a blower card.

You are putting things together about which I would have my reservations :).
If you take some of the leaks and put the wattages next to them, do you think that at 150 W you need two 8-pin power connectors? So at best this might be the board for Navi 20, which was rated at 225 W (?) and due beyond a Q3 2019 release?

[Image: Navi_leaked.jpg]

So a blower on the lower parts is pretty debatable (the problem with a blower is that when you hit peak watts and produce a lot of heat, the blower goes crazy), and you would likely not hit those peak temperatures. And to be honest, you get so much more value for your money not buying the blower edition, as Vega 56/64 shows, and by now, if you are going to buy Navi 20, you would know better...
 
I basically ignored WCCFTech's post. I'm always baffled at how much screen space they can occupy without saying anything. And once they spend that amount of text appearing like they know something, they regurgitate the past 12 months of rumors for the umpteenth time, making it seem like a very, very long article that you must read! I got tired of them long ago.

This guy on YouTube does a pretty decent and entertaining analysis/speculation of the Navi PCB, though (I learned a bunch of things about GPU PCBs that I didn't know before!):



Actually, Buildzoid explained why GDDR5 on Vega would not work and why it needs HBM.
Don't forget that Fudzilla already had an article up with pricing information around October last year, detailing that Navi would launch soon.
Some things still do not go smoothly over at RTG (whereas the CPU side seems not to have these hiccups).
 
You are putting things together where I would have my reservations :).
If you take some of the leaks and put the amount of watts next to it, do you think that at 150 Watt you need 2 8 pin power connectors ? So at best this might be the board for Navi 20 which was rated at 225 Watt (?) and beyond 2019 Q3 release?

View attachment 157593
So a blower on lower parts is pretty debatable (the problem with the blower is when you hit the peak watts and produce a lot of heat then the blower goes crazy) you would likely not hit those peak temperatures. And to be honest you get so much more value for your money not buying the blower edition on stuff as Vega56/64 shows that and by now if you are going to buy Navi 20 you would know better...

If you need any more nails in the coffin of AdoredTV's fake "leaks", here they are.

If the supposed "Radeon RX 3080" has a TDP of 150W, why would it have two 8-pin connectors and VRMs on par with the reference Vega?

It wouldn't.
 
If you need any more nails in the coffin of AdoredTV's fake "leaks", here they are.

If the supposed "Radeon RX 3080" has a TDP of 150W, why would it have two 8-pin connectors and VRMs on par with the reference Vega?

It wouldn't.
So you know for a fact that that board is Navi, you know which model of Navi, you know everything when it comes to "disproving" AdoredTV's information? All you have is baseless speculation; that is the title of the video made by Buildzoid.
The reason Buildzoid says it is baseless is that no one can verify anything about the PCB. The design and layout suggest that it is GDDR6; that is about all he can say about it. Nowhere does he confirm anything you are suggesting.
Having more than one 8-pin connector also does not warrant a bold conclusion. Better yet, the second 8-pin connector has no fixed meaning; it could be used as a 6-pin as well.

Why don't you timestamp each and every part of the video that, according to you, disproves what AdoredTV leaked? And I mean where Buildzoid says something that exactly contradicts AdoredTV's leaks.

All you have is a picture and someone suggesting what it is, and that somehow disproves everything?
Not really.

You know why there are no other sources that emphatically contradict AdoredTV?
 
So you know for a fact that that board is Navi, you know which model of Navi, you know everything when it comes to "disproving" AdoredTV's information? All you have is baseless speculation; that is the title of the video made by Buildzoid.

So which video card do you think this PCB is for?

Having more than one 8-pin connector also does not warrant a bold conclusion. Better yet, the second 8-pin connector has no fixed meaning; it could be used as a 6-pin as well.

Why don't you timestamp each and every part of the video that, according to you, disproves what AdoredTV leaked? And I mean where Buildzoid says something that exactly contradicts AdoredTV's leaks.

All you have is a picture and someone suggesting what it is, and that somehow disproves everything?
Not really.

One 8-pin connector would be plenty for 150W.

Radeon RX 480 has a 150W TDP and came with one 6-pin connector.

You know why there are no other sources that emphatically contradict AdoredTV?

There are. You just didn't see them because you have blinders on.
 
Having more than one 8-pin connector also does not warrant a bold conclusion. Better yet, the second 8-pin connector has no fixed meaning; it could be used as a 6-pin as well.
Yeah, you can't necessarily tell a GPU's power draw just from the number of PCIe power connectors. You could certainly say that if a card has two 8-pin PCIe power connectors then it's likely to draw more power than a card with a single 6-pin connector, for example, but it certainly doesn't mean that the card with two 8-pins will draw the maximum 375W (75W from the PCIe slot and 2 x 150W from the two 8-pins). Also, there will typically be a fair amount of headroom in the PCIe connectors, for overclocking or as a margin against spikes in power draw.

As an example, my 2070 has an 8-pin and a 6-pin PCIe connector, so in theory it could draw 300W (75W + 150W + 75W). It's actually rated for 175W (at stock, at least), which could be achieved with room to spare from just an 8-pin...
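If you want to sanity-check those numbers, here's a minimal sketch of the arithmetic (the per-connector ceilings are the PCIe spec limits; the cards and TDPs are just the examples already mentioned in this thread):

```python
# PCIe power-budget ceilings: slot 75 W, 6-pin 75 W, 8-pin 150 W.
CONNECTOR_WATTS = {"slot": 75, "6-pin": 75, "8-pin": 150}

def max_board_power(*aux_connectors: str) -> int:
    """Theoretical maximum draw: the slot plus every auxiliary connector."""
    return CONNECTOR_WATTS["slot"] + sum(CONNECTOR_WATTS[c] for c in aux_connectors)

print(max_board_power("8-pin", "6-pin"))  # 300 -> the 2070 above, rated 175 W stock
print(max_board_power("6-pin"))           # 150 -> RX 480, rated 150 W
print(max_board_power("8-pin", "8-pin"))  # 375 -> ceiling of the leaked 2x 8-pin PCB
```

The gap between that ceiling and the stock rating is exactly the headroom for overclocking and power spikes.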
 
Yeah, you can't necessarily tell a GPU's power draw just from the number of PCIe power connectors. You could certainly say that if a card has two 8-pin PCIe power connectors then it's likely to draw more power than a card with a single 6-pin connector, for example, but it certainly doesn't mean that the card with two 8-pins will draw the maximum 375W (75W from the PCIe slot and 2 x 150W from the two 8-pins). Also, there will typically be a fair amount of headroom in the PCIe connectors, for overclocking or as a margin against spikes in power draw.

As an example, my 2070 has an 8-pin and a 6-pin PCIe connector, so in theory it could draw 300W (75W + 150W + 75W). It's actually rated for 175W (at stock, at least), which could be achieved with room to spare from just an 8-pin...

You are talking about an AIB model, which this definitely isn't.

GeForce RTX 2070 Founders Edition has one 8-pin connector.
 
You are talking about an AIB model, which this definitely isn't.

GeForce RTX 2070 Founders Edition has one 8-pin connector.
Regardless of whether it's an AIB model or not, my point was that actual power draw =/= maximum possible power draw from the PCIe connectors present. Two 8-pin connectors might suggest a high power draw but it's not proof of one.
 
There are. You just didn't see them because you have blinders on.

That sounds very powerful, but it's an empty statement. Proof or it didn't happen. The onus of justifying your claim is on you; Pieter3dnow can't really prove a negative if he can find no such sources :)

Yeah, you can't necessarily tell a GPU's power draw just from the number of PCIe power connectors. You could certainly say that if a card has two 8-pin PCIe power connectors then it's likely to draw more power than a card with a single 6-pin connector, for example, but it certainly doesn't mean that the card with two 8-pins will draw the maximum 375W (75W from the PCIe slot and 2 x 150W from the two 8-pins).

This, exactly. And considering the RX480 launch was plagued with cards that drew too much power from the PCIe slot, requiring AMD to issue a driver fix that could make cards run slower, it wouldn't surprise me one bit if they're playing it overkill-safe this time and designing the board with, say, two 6-pin connectors despite not using all the power (or 8+6, whatever makes more sense, I don't know).
 
I expect that there will be three SKUs.

GeForce RTX 2070 performance for GeForce RTX 2060 price

GeForce RTX 2060 performance for GeForce GTX 1660 Ti price

GeForce GTX 1660 Ti performance for GeForce GTX 1660 price
 
Two 8-pin connectors might suggest a high power draw but it's not proof of one.

The PCB is likely an ES/dev board. The dev 2080 had 2x 8-pin and the dev 2080 Ti 3x 8-pin. It also appears that the chip pad area is larger than TU104's. Power draw is anyone's guess. It depends on whether they aimed for high transistor density -> more dies/wafer -> lower price.

Edit: Correction.
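To illustrate that last point, here's a rough dies-per-wafer sketch using a common approximation (the die sizes are made up purely for illustration; nothing about Navi's actual die area is confirmed):

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Common approximation for usable dies on a round wafer, ignoring yield."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

# Hypothetical: a denser design shrinks the same chip from 250 mm^2 to 200 mm^2.
print(dies_per_wafer(250))  # ~240 dies per 300 mm wafer
print(dies_per_wafer(200))  # ~306 dies -> more dies/wafer -> lower unit cost
```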
 
I expect that there will be three SKUs.

GeForce RTX 2070 performance for GeForce RTX 2060 price

GeForce RTX 2060 performance for GeForce GTX 1660 Ti price

GeForce GTX 1660 Ti performance for GeForce GTX 1660 price

Reasonable. The RTX 2070 will be a year old in October.
 
2070 performance will be good enough if the price is right. Being AMD, I'm sure it will be.
 
Yeah, speculation is good fun but there's no point getting ahead of ourselves trying to score points about power consumption, price or performance in the absence of reliable data. Once we see the reviews then we can decide.
 
I am ready to update my Navi predictions

________________________________

GeForce RTX 2070 performance

250W TDP

$350-$400


________________________________

GeForce RTX 2060 performance

215W TDP

$280-$300

________________________________

GeForce GTX 1660 Ti performance

180W TDP

$220-$250
 
I am ready to update my Navi predictions

________________________________

GeForce RTX 2070 performance

250W TDP

$350-$400


________________________________

GeForce RTX 2060 performance

215W TDP

$280-$300

________________________________

GeForce GTX 1660 Ti performance

180W TDP

$220-$250

If this comes true then NV will eat their lunch as usual. I could see a possible $25-50 cut in the NV products, and they would still heavily outsell AMD.

Not good enough.
 
I am ready to update my Navi predictions

________________________________

GeForce RTX 2070 performance

250W TDP

$350-$400


________________________________

GeForce RTX 2060 performance

215W TDP

$280-$300

________________________________

GeForce GTX 1660 Ti performance

180W TDP

$220-$250

That is hard to believe, because if that were true then it's worse than the VII when it comes to perf/watt. I doubt it's giving worse perf/watt with the tweaks they have made, especially after dedicating more resources to Navi (per the Sony rumors) than they did with Vega.
 
It will come down to what the market can bear, right?

Are non-enthusiast gamers willing to pay $350-400 for a non-RTX, 2070-equaling AMD card?
 
That is hard to believe, because if that were true then it's worse than the VII when it comes to perf/watt. I doubt it's giving worse perf/watt with the tweaks they have made, especially after dedicating more resources to Navi (per the Sony rumors) than they did with Vega.

Radeon VII got most of its performance gain from having 1024 GB/s of memory bandwidth.

You are not going to come close to that with GDDR6.

To even get Radeon VII's performance with less than half the memory bandwidth, AMD would have to make great improvements to its delta color compression.

That’s not to mention that GDDR6 uses more power than HBM2.
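For rough context, here's the back-of-the-envelope bandwidth math (the 256-bit/14 Gbps GDDR6 configuration is purely an assumed example; Navi's actual memory setup is unknown):

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits / 8 * gbps_per_pin

radeon_vii = bandwidth_gb_s(4096, 2.0)  # HBM2, four 1024-bit stacks -> 1024.0 GB/s
navi_guess = bandwidth_gb_s(256, 14.0)  # assumed 256-bit GDDR6 @ 14 Gbps -> 448.0 GB/s

print(navi_guess / radeon_vii)  # 0.4375 -> indeed less than half of Radeon VII's
```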
 
It will come down to what the market can bear, right?

Are non-enthusiast gamers willing to pay $350-400 for a non-RTX, 2070-equaling AMD card?

A good RTX 2060 is around $380, so I don't see why not! RTX is not really worth the price on the 2060/2070 anyway. It's pretty well noted that people aren't willing to pay the premium for RTX if they can get similar performance without it for cheaper. If you have the RX 3080 compete with the 2070 at 2060 prices, it's not a bad deal.
 
If this comes true then NV will eat their lunch as usual. I could see a possible $25-50 cut in the NV products, and they would still heavily outsell AMD.

Not good enough.

STOP thinking AMD needs to lower their prices to compete with Nvidia. This is simply not true. AMD has lowered their prices plenty of times before, and many still buy the overpriced, worse-performing Nvidia cards. Case in point: the absolute bargain that the RX570 has been for a number of months now, and yet people keep buying Nvidia at that tier.

Lowering prices will do nothing to help AMD. Price is not the problem, consumer culture is, fueled by marketing lies. If price won't help, AMD might as well charge as much as they can.
 
STOP thinking AMD needs to lower their prices to compete with Nvidia. This is simply not true. AMD has lowered their prices plenty of times before, and many still buy the overpriced, worse-performing Nvidia cards. Case in point: the absolute bargain that the RX570 has been for a number of months now, and yet people keep buying Nvidia at that tier.

Lowering prices will do nothing to help AMD. Price is not the problem, consumer culture is, fueled by marketing lies. If price won't help, AMD might as well charge as much as they can.

…more like a lack of awareness.

Consumers are not going to buy it no matter how well it is priced if they don't know about it.
 
…more like a lack of awareness.

Consumers are not going to buy it no matter how well it is priced if they don't know about it.
It would be nice to see them actually pick up some useful marketing talent... they seriously need to get back to doing recognizable commercials and come up with something that sticks in viewers' minds like Intel does.
 
Lowering prices will do nothing to help AMD. Price is not the problem, consumer culture is, fueled by marketing lies. If price won't help, AMD might as well charge as much as they can.
I'm not sure it's marketing lies, just marketing full stop. AMD have done a piss-poor job of convincing people that their GPU products are actually worth buying, hence their minority market share. They might not be able to compete at the very top end of the market, but that perception has trickled all the way down the product stack and AMD's efforts to counter it have been ineffective, despite them having very competitive products lower down the stack.

Again, it's up to AMD to sell their stuff better, and if there are any marketing lies (by which I'm presuming you mean lies from Nvidia?) then AMD need to be able to more effectively counter them.

However, I totally agree with you that pricing isn't AMD's problem, it's persuading potential customers to choose them over Nvidia on brand. Unfortunately in a two-way battle once one company establishes a reputation (founded or unfounded) for being "better" then it's hard to claw back that market share unless you have a product that you can convince people is superior, which is AMD's problem on the GPU side. They turned it around with Ryzen on the CPU side, by a combination of delivering a quality product at a competitive price and marketing it successfully (by pushing the "more cores" angle, even though more cores isn't necessarily what everyone actually needs - but they marketed it as such and had success because they were able to offer more cores than Intel's price-equivalent offerings).

I agree with sirmonkey1985 that AMD need to focus their attention on marketing their GPUs successfully. It obviously helps enormously if you have an excellent product, but I don't think they're currently making the most of selling the products that they currently have.
 
I'm not sure it's marketing lies, just marketing full stop. AMD have done a piss-poor job of convincing people that their GPU products are actually worth buying, hence their minority market share. They might not be able to compete at the very top end of the market, but that perception has trickled all the way down the product stack and AMD's efforts to counter it have been ineffective, despite them having very competitive products lower down the stack.

That is actually a big part of marketing. It's called "Halo Products". Making the best, price-is-no-object product catches a lot of attention. While most people won't pay that, they still think of that company when they go to buy.

Also, look at every CPU review: they want to use the top GPU so they aren't bottlenecking, which means most CPU reviews are using a 2080 Ti. More reinforcement that Nvidia is best.

Bottom line is that not competing at the top end hurts their marketing a fair bit.

Beyond that, I do think having good products that win approval in reviews is one of the most important parts. Look at the GTX 1650: it was universally panned in reviews and in online comments. No one doing any research at all would buy one of those cards unless they have an old PC without power connectors. Everyone online and in reviews says to buy an RX570 instead.

IMO those two matter more than almost any advertisements.
 
That is actually a big part of marketing. It's called "Halo Products".
Yeah, I could have sworn that I'd used the term "halo product" in my post, but reading it back it appears that I haven't :D I meant to, though!
 