AMD's best Big Navi GPU may only be a match for Nvidia Ampere's second tier

Yet they actually have plenty of partners and customers. Look at AI, look at data centers. They know what business partners to seek, from the looks of it. Maybe a more accurate statement would be "hated by x% of forum posters"; that is obviously true, while your statements are not.

Lol, this is gospel right here.
 
I personally expect Big Navi to at least be equal to the 3080 in games that are not exclusively optimized for Nvidia. Of course, the same will apply the other way around: the 3080 will probably be slower than Big Navi in titles that are optimized for AMD and not Nvidia. That said, I have zero reason to upgrade, since I have upgraded my SSDs, CPUs and one motherboard this past year.
 
I find that those that call others fanboys are actually the fanboys. I liked your other response by the way considering optimizations, just not this one. It really offers nothing more than name calling.
Was not calling you or anyone else a fanboy. Just pointing out that blindly praising a company without context needs to stop. It's really not good for anyone. Sorry for the confusion.
 
Source? Or wishful thinking? Sounds like the latter.

Source? The fact that they will be powering both next-gen consoles... with both console players talking about RT.
Like it or not, even if AMD's solution is inferior from a hardware perspective, they end up with the better software support. RT paths in every engine and every game that supports them are going to be optimized for the PS5 and Xbox.
 
Was not calling you or anyone else a fanboy. Just pointing out that blindly praising a company without context needs to stop. It's really not good for anyone. Sorry for the confusion.

Ok, thanks for clearing that up. I appreciate it. Sometimes I seem like a fanboy, but truly I am a fan of both Nvidia and AMD. I love them both and I just want more options, so AMD, step it up please this time.
 
Drop in the bucket, don't you think? Intel still controls over 60% market share in the CPU space. But sure, drop by drop they are finally starting to be a real contender. But let's all be smart adults here; we all know Intel is working the problem and will eventually come up with something that competes in multi-core. We're already hearing of possible competition in Tiger Lake, or am I mistaken?

Nvidia also makes more than just GPUs; they are in automotive, AI and cloud. So it's not fair to say that they only make GPUs and that poor little AMD has to make both CPUs and GPUs to try to compete. And you're saying Nvidia can't lose? Sure they can, but the fact is they aren't losing.

This is the type of fanboyism that needs to end.
Tiger Lake is supposed to have fewer cores than the current gen... so maybe you are mistaken in putting "coming up with something to compete in multi-core" and "Tiger Lake" in the same sentence. They are going all out to maintain their slight gaming edge and giving up on high core counts until they can move to a process that can handle tighter transistor packing.

He was saying that if Nvidia falls in the GPU space, they don't have many other markets or interests to fall back on, although he mentioned AI and networking (since buying Mellanox). AMD can fail 100% in GPUs and still have their CPU business. Basically, if Nvidia loses, it would more or less destroy them (maybe not completely, as they are starting to shift into other niche markets). He isn't saying they can't fail, just that they have a bit more motivation not to, since they don't have a CPU business to fall back on, whereas AMD can fall behind, not be in the lead, and still be fine because their other business helps keep them afloat.
 
"Coreteks' AIB sources also suggest there will be a pair of Big Navi GPUs at launch, based on the same Sienna Cichlid silicon. That's not much of a stretch as this is classic AMD graphics card practice; it's very own rule of two. This typically sees a cut-down card launching alongside the full-fat option. Just think about the RX 5700 XT and RX 5700 last year, RX 580 and RX 570, R9 390X and R9 390… and on and on.

And traditionally we usually end up recommending the lower-spec one as its lower price, and generally only-very-slight specs cut, end up making it a better value card by comparison, offering often very similar performance.

It looks like those two Sienna Cichlid cards will be the only Big Navi GPUs we see this year too, if this report is to be believed. A second fish-based codename, Navy Flounder, has been tied to the new RDNA 2 architecture, and is reportedly a mid-range version of the Big Navi GPU slated for release in the first three months of 2021.

With only the high-end market set to be served at the tail-end of this year, it seems almost certain that most of us are going to have to wait until next year to see anything more affordable than the ultra-enthusiast, ultra-expensive GeForce and Radeon cards."


https://www.pcgamer.com/amd-big-navi-rdna-2-aimed-at-rtx-3080/

Coreteks' brain has not been working too well lately. While I will not claim that AMD can beat the 3090, I will claim they can beat the 3080, that is, if they raise the power level to 300 watts or slightly higher. Their card can boost to over 2200 MHz. If they run it stock at 1950 MHz and boost to 2230 MHz, that beats Nvidia by over 10% in frequency. Recent rumors report a 10% improvement in IPC with RDNA 2, but that contradicts everything reported before of a 15% improvement in IPC. If it is in fact 15%, that will help whittle away some of the difference in performance. ALL previous rumors indicated a 30 to 40% performance edge over the 2080 Ti. That means their top card should slightly edge a 3080. With 16GB of memory likely on the top-tier AMD card, it will be a far better bet than the 8GB of the 3070 or the 10GB of the 3080.
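For what it's worth, here is a quick back-of-envelope check of that reasoning in Python. Every input is either a rumored figure from the post above or an assumption of mine (that the 3080 lands roughly 30% above the 2080 Ti, as the leak discussed later in this thread suggests); none of these are confirmed specs.

```python
# Back-of-envelope sketch only; every number here is a rumor or an assumption,
# not a confirmed spec.

def vs_2080ti(uplift):
    """Express a rumored uplift over the 2080 Ti as a multiple of 2080 Ti performance."""
    return 1.0 + uplift

big_navi_low  = vs_2080ti(0.30)   # low end of the "30 to 40% over 2080 Ti" rumor
big_navi_high = vs_2080ti(0.40)   # high end of that rumor
rtx_3080      = vs_2080ti(0.30)   # assumed ~30% over the 2080 Ti, per the leak later in the thread

print(f"Big Navi vs 3080: {big_navi_low / rtx_3080:.2f}x to {big_navi_high / rtx_3080:.2f}x")
# -> roughly 1.00x to 1.08x under these assumptions, i.e. the top card would
#    only "slightly edge" a 3080, which is the claim being made here.
```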
 
Tiger Lake is supposed to have fewer cores than the current gen...
I'm seeing max quad-core... which doesn't help when I'm looking at 15" models with H-series octa-core CPUs, and AMD octa-core U-series CPUs are rocking along with sixteen threads in the 15W to 25W range.
 
Poor Nvidia, nothing to fall back on except gaming cards, running scared of AMD, releasing cards with too little VRAM.

I have no idea how they could ever have gotten to an 80% market share. Cheaters!

Nvidia has a lot more going on than just gaming cards.
The 10GB VRAM being an issue has yet to be seen.
 
Poor Nvidia, nothing to fall back on except gaming cards, running scared of AMD, releasing cards with too little VRAM.

I have no idea how they could ever have gotten to an 80% market share. Cheaters!
Meh, you're obviously going to take it as you will. The point was that AMD doesn't rely on their graphics division as much as Nvidia does. If you don't understand that, then you just don't want to. It wasn't a dig at either company; it's just where they are. Nvidia has been and continues to be more successful, because they can't afford not to be, while AMD can afford to be less successful. It's not that AMD wants to suck at it or be second best; Nvidia has just been able to execute better. The point had nothing to do with whatever you're spouting off about. Being focused on one thing doesn't mean you can't be good at it, so I have no clue where you got that "how did Nvidia get market share" comment from, as it's not based on anything being discussed; if anything, Nvidia being more focused is part of the reason it's doing better.
 
The 10GB VRAM being an issue has yet to be seen.
I doubt it will be an issue in the short term, but I think it's likely to be an issue later in the card's useful lifespan. One thing to keep an eye on is how developers leverage the faster storage on the next consoles and whether it leads to higher VRAM usage on PC.
 
I doubt it will be an issue in the short term, but I think it's likely to be an issue later in the card's useful lifespan. One thing to keep an eye on is how developers leverage the faster storage on the next consoles and whether it leads to higher VRAM usage on PC.

Something to keep in mind is that usage =/= actual necessity. For instance, the latest Dragon Quest chews up a ridiculous amount of VRAM, and it's a cel-shaded game. Final Fantasy XV, even at 1080p, chews up nearly the entirety of the VRAM on my 2070 Max-Q (8GB, IIRC). Developers can send all kinds of data to VRAM for a myriad of reasons: out of necessity, to help with a seamless gameplay experience, or simply because of garbage optimization (Dragon Quest, you bitch).
 
Nvidia has a lot more going on than just gaming cards.
The 10GB VRAM being an issue has yet to be seen.
As far as the 10GB of VRAM goes, right now there are a few games that will perform better with 12GB or 16GB. In a year that number will be closer to 100, and in two years, hundreds. It is short-sighted to release a high-end card without extra memory, period.
 
I know Deus Ex MD can use at least 11GB of VRAM, probably much more on max settings. RE2 as well.
 
I know Deus Ex MD can use at least 11GB of VRAM, probably much more on max settings. RE2 as well.
Microsoft Flight Simulator can take any amount of VRAM you throw at it. Even the rumored 48GB Ampere Titan, I guess.
 
I know Deus Ex MD can use at least 11GB of VRAM, probably much more on max settings. RE2 as well.
Then the question is, how do they run with 8-10GB @ 4K?

I'm guessing they are totally "Playable"...
 
No, not playable at all. Deus Ex was like 10 fps, but I don't know if that was only because of the VRAM.
 
No, not playable at all. Deus Ex was like 10 fps, but I don't know if that was only because of the VRAM.

Odd, Guru3D had decent results at 4K:

[Guru3D 4K benchmark chart]


https://www.guru3d.com/articles-pages/geforce-rtx-2080-ti-founders-review,20.html
 
They are not even failing in PC gaming. Source after source of information and statistics has Ryzen CPUs selling to more gamers than Intel's. Micro Center's AMD section is bigger than Intel's, and I am not kidding.

There is that giant German retailer, Mindfactory or something like that, which confirms Ryzen sells more than Intel in Germany.

As far as the GPU space, well... that's different, but Nvidia only makes GPUs, so you're going to see way more of those. AMD does CPUs and GPUs, and mostly CPUs.

It's not fair to say AMD is failing, especially when they are not all-in like Nvidia. Nvidia can't afford to lose; AMD doesn't need to worry when it comes to GPUs. They are steady like a freight train, gaining CPU market share globally, and that will continue as Zen 3 looks to smoke anything Intel has to offer, and Nvidia doesn't even make CPUs. And if Nvidia tried, they're 30 years behind Intel and AMD.

I would compare saying AMD is a failure because its GPUs can't keep up with Nvidia to saying Nissan is a failure because the Leaf can't compete with Tesla. Nissan makes an incredibly large array of products that diversify and strengthen the company, while Tesla makes a couple of cars based on a single essential tech. I'd be far more worried about Nvidia failing than AMD; Nvidia doesn't really have a product to fall back on. Maybe AI, and now networking.

Long shot, but what if AMD RDNA 3 or 4 absolutely ends up smoking Nvidia? Dr. Su is a bright-ass CEO. Jensen is kind of a douche and is very Tim Cook-like, whereas Su is more akin to Jobs.

And please forgive me. I don't speak with authority. I make mistakes, this is opinion, and I may even sound foolish.

https://www.forbes.com/sites/davidc...jobs-was-a-jerk-you-shouldnt-be/#74dab9d04045

You have your characters mixed up. Tim Cook is actually the nice one, while Steve Jobs was known for being a jerk.

Tim Cook helped open Apple back up and made them a more generous company with regards to philanthropy.

https://www.wsj.com/articles/tim-co...china-iphone-ipad-apps-smartphone-11596833902

Tim Cook is actually a nice guy in comparison.

Steve Jobs was about forcing proprietary, closed solutions on the public and about changing public perception so that their way was seen as the best. If anyone is like Steve Jobs in that sense, it is Jensen.

https://www.theverge.com/2013/2/10/3973804/apple-tim-cook-was-opposed-to-samsung-lawsuit-reuters

Steve was all about protecting anything they created, making sure they were the beneficiary, and was much more lawsuit-prone.

I think, for better or worse, Steve Jobs is much more like Jensen, while Tim Cook is much more like Lisa Su. Tim Cook, like Su, has been an execution machine, making the products they already have better while not really branching off into new industries. Lisa Su has been a great CEO, but she seems much more like an engineer who gets stuff done, which Tim Cook is also famous for. However, Su has not really taken AMD into uncharted territory, unless you count AMD making bicycles.

Jensen, like Jobs, has a strong vision of seeing their products everywhere in new ways, which has turned them into super successful companies. It has resulted in some failures, like their Tegra collapse in the cell phone market, but it has also resulted in successes like their professional visualization business, CUDA, and the AI data center market. Under Jensen, Nvidia has taken GPUs into new markets and industries, to the point where Nvidia's GPU revenue eclipses AMD's combined CPU and GPU revenue, even though CPUs are the much larger market. This is much more akin to Apple taking its UI and design strength with MP3 players and creating the iPhone and app ecosystem. This vision, which really hasn't been demonstrated by Su yet (name one product where AMD obtained first-mover advantage and dominated a market from the beginning), is what makes Jensen and Jobs such strong CEOs.

While Jensen might not be liked around here, his competency as a CEO is unquestionable; he even topped a list of the best CEOs in the world last year.

https://hbr.org/2019/11/the-ceo-100-2019-edition
 
AMD Big Navi GPU pricing rumored to be slashed in light of Nvidia's Ampere reveal

An unnamed AMD partner is claiming that, as a result of the recent Nvidia Ampere RTX 30-series unveiling, the red team is expected to cut the cost of its 16GB AMD Radeon RX 6000-series card to better compete with the new GeForce GPUs...

https://www.pcgamer.com/amd-rx-6000-series-price-drop-pre-launch/


Already been discussed. Coreteks is talking about one card, and everyone is saying it's more likely Navi 22 and not the Navi 21-based card.
 
Something to keep in mind is that usage =/= actual necessity. For instance, the latest Dragon Quest chews up a ridiculous amount of VRAM, and it's a cel-shaded game. Final Fantasy XV, even at 1080p, chews up nearly the entirety of the VRAM on my 2070 Max-Q (8GB, IIRC). Developers can send all kinds of data to VRAM for a myriad of reasons: out of necessity, to help with a seamless gameplay experience, or simply because of garbage optimization (Dragon Quest, you bitch).

The other issue is that developers can also mask the actual usage by pre-allocating/requesting VRAM, as in the case of one of the COD games, where it would request something like 80-90% of the available VRAM. Any monitoring app, including Windows, then showed it using almost all the VRAM when in reality it was only using a fraction of it.
 
The other issue is that developers can also mask the actual usage by pre-allocating/requesting VRAM, as in the case of one of the COD games, where it would request something like 80-90% of the available VRAM. Any monitoring app, including Windows, then showed it using almost all the VRAM when in reality it was only using a fraction of it.

GPU VRAM usage needs to be finer-grained: how much is actually used, and how much is cached.
The balance between those two numbers holds many answers.
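As a rough illustration of why the reported numbers are so coarse: NVML, which most monitoring tools on Nvidia cards sit on top of, only exposes how much framebuffer memory is allocated, not how much of it the game actually touches each frame. A minimal sketch, assuming the pynvml Python bindings are installed and a GPU is present at index 0:

```python
# Minimal sketch: what a typical VRAM monitor actually sees on an Nvidia card.
# Assumes the pynvml package is installed and a GPU is present at index 0.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    # 'used' is framebuffer memory allocated across all processes. A game that
    # pre-allocates a large pool shows up here in full, even if it only actively
    # reads/writes a fraction of that pool per frame.
    gib = 2 ** 30
    print(f"total:                          {mem.total / gib:.1f} GiB")
    print(f"allocated (reported as 'used'): {mem.used / gib:.1f} GiB")
    print(f"free:                           {mem.free / gib:.1f} GiB")
finally:
    pynvml.nvmlShutdown()
```

So the "used vs cached" split isn't something these allocation-based tools can tell apart today; allocation is the only number exposed.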
 
Already linked above. Lol, I think he is just trolling people. I doubt they would announce Big Navi with no pre-announcement. If anything, they might announce an event.
Or it could be a Zen/CPU-related announcement.
 
Already linked above. Lol, I think he is just trolling people. I doubt they would announce Big Navi with no pre-announcement. If anything, they might announce an event.


Well, the Xbox Series X preorder date was just announced, so maybe he was referring to that. Too much shit going on, lol.
 
They might want to get the details out before they're drowned out by iPhone 12 hype next week. Also, the new RTX cards may be better than they expected, so they need to get information out in front of the 3080 release.
 
They might want to get the details out before they're drowned out by iPhone 12 hype next week. Also, the new RTX cards may be better than they expected, so they need to get information out in front of the 3080 release.

Well, just saw this on Videocardz. I was suspecting this might be the case. Best case, it would show 40-50% faster than the 2080 Ti in the games Nvidia was showing, but it looks like it's more like 30%. More reviews to be seen, but overall I think it's going to end up being around 60% faster than the 2080 and 30% over the 2080 Ti on average. The 3070 is probably on par with the 2080 Ti, or might be a notch slower at 4K.

https://videocardz.com/newz/alleged-geforce-rtx-3080-graphics-card-test-leaks-online
 
30% is a bummer... it won't make max settings at 4K in RDR2 really smooth. Let's hope the 3090 adds another few %.
 