people are still downing amd for releasing a product that's within range of nv's high-end gpu. yes, it doesn't have feature parity with what nv offers, and that doesn't matter: not a single game dev is looking at this extra eye-candy shit until the consoles support ray tracing, and that's not happening until amd supports it. go cry me a river, jhh and nv zealots; you can create the best platform and still get kneecapped because you're not in any consoles, which is what all game devs focus on.
 
What do people do when they're nervous? They lash out. Suddenly Jensen is lashing out a lot. Graphics architecture aside, AMD being on a lithography node half the size of his is making him nervous. As it should.

You mean 7nm with double/triple patterning, right? If Nvidia were smart, they would wait for EUV lithography in 2020 and just crush them.
 
people are still downing amd for releasing a product that's within range of nv's high-end gpu. yes, it doesn't have feature parity with what nv offers, and that doesn't matter: not a single fucking game dev is looking at this extra eye-candy shit until the consoles support ray tracing.

Yeah, they are. Even then, DLSS is where it's at, at least until the cards are powerful enough to do more than just lighting and reflections. Higher frame rates or the uncanny valley? Hmm, let me see here.
 
Yeah, they are. Even then, DLSS is where it's at, at least until the cards are powerful enough to do more than just lighting and reflections. Higher frame rates or the uncanny valley? Hmm, let me see here.
it's been months since launch and fewer than 5 games use it. jhh needs to stop bitching and put up or shut up.
 
AMD is in both the next Xbox & PlayStation; I'd say Nvidia is fucked. Sure, they'll sell $2,000 cards, but the console market is where the real money is.
 
Remember that the standard has been set by G-Sync, which has no ratio: it's simply 30Hz to whatever the max of the monitor is. The talk about ratios came about with the need to categorize the plethora of crappy "Freesync" implementations.
Uh, sorry?

That is one of the standards Nvidia uses for certifying a VESA async monitor as “Gsync compatible”.

Sadly, the only legit source I could find is Gamers Nexus in video form; apologies for the lack of a text option. It's about 3:20 onwards in the video.



The 2.4 ratio is pretty much the only reason why so few monitors were validated as Gsync Compatible and able to meet the standards Nvidia used.


On the Freesync side, LFC requires merely a ratio of 2.0, down from an initial 2.5.

I suspect many more of the 400 monitors tested would have been approved if 2.0 were the minimum.

And given it is all the same VESA standard... either Nvidia is limiting the number of approved monitors for marketing reasons, or AMD can simply do LFC better within the current standard.

Both seem possible, if not a mix of the two.
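
To make the ratio talk concrete, here's a quick sketch of the math (Python; the 2.4 and 2.0 thresholds are the figures discussed above, while the example monitor ranges are made up for illustration):

```python
# Sketch of the VRR-range ratio math discussed above. The 2.4 (Nvidia
# "Gsync Compatible") and 2.0 (AMD LFC) thresholds are the figures cited
# in this thread; the monitor ranges below are invented examples.

def vrr_ratio(min_hz: float, max_hz: float) -> float:
    """Ratio of max to min refresh rate across the monitor's VRR range."""
    return max_hz / min_hz

def meets(min_hz: float, max_hz: float, threshold: float) -> bool:
    """True if the range is wide enough for the given ratio threshold."""
    return vrr_ratio(min_hz, max_hz) >= threshold

panels = [("48-144Hz", 48, 144),  # ratio 3.00: clears both bars
          ("48-100Hz", 48, 100),  # ratio 2.08: clears 2.0, misses 2.4
          ("40-75Hz", 40, 75)]    # ratio 1.88: misses both

for name, lo, hi in panels:
    print(f"{name}: ratio {vrr_ratio(lo, hi):.2f}, "
          f"LFC (2.0): {meets(lo, hi, 2.0)}, "
          f"Gsync Compatible (2.4): {meets(lo, hi, 2.4)}")
```

A 48-100Hz panel is exactly the kind that would clear AMD's 2.0 bar for LFC but miss Nvidia's 2.4 cutoff, which fits the pattern of so few monitors passing validation.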
 
AMD is in both the next Xbox & PlayStation; I'd say Nvidia is fucked. Sure, they'll sell $2,000 cards, but the console market is where the real money is.
Not really; it's volume sales, but the profit margin is kept pretty low to keep the consoles cheap. However, it should give AMD some pulling power in developing standards.
 
AMD is in both the next Xbox & PlayStation; I'd say Nvidia is fucked. Sure, they'll sell $2,000 cards, but the console market is where the real money is.

long-term residual income for the win, with a dedicated install base. i say the ball is in amd's court as far as which features get supported by devs.
 
AMD is in both the next Xbox & PlayStation; I'd say Nvidia is fucked. Sure, they'll sell $2,000 cards, but the console market is where the real money is.
AMD probably makes $10-20 in gross profit per console sold currently, depending on the console.

A 2060 should generate somewhere between $80 and $120 in gross profit by my estimates: a 60-75% margin.

A 2080? Probably $350-450 in gross profit: an 80%+ margin.

AMD’s profit per chip will rise with new consoles, but the margin will still be 15-20%.


The console business is good for AMD, and this generation saved AMD, but there is money to be made in both places.
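
For anyone who wants to sanity-check those numbers, the implied chip prices fall out of the basic margin relation (margin = gross profit / selling price). A minimal sketch using the rough figures above (my guesses, not disclosed numbers from AMD or Nvidia):

```python
# Back-of-envelope margin math for the estimates above. All dollar figures
# are this post's rough guesses, not official AMD or Nvidia numbers.

def implied_price(gross_profit: float, margin: float) -> float:
    """Selling price implied by a gross profit at a given margin.
    margin = gross_profit / price, so price = gross_profit / margin."""
    return gross_profit / margin

# ~$15 profit at ~17.5% margin -> the console APU sells for roughly $86
print(f"Console APU:     ~${implied_price(15, 0.175):.0f}")
# ~$100 profit at ~67% margin -> a 2060-class chip sells for roughly $149
print(f"2060-class chip: ~${implied_price(100, 0.67):.0f}")
# ~$400 profit at ~80% margin -> a 2080-class chip sells for roughly $500
print(f"2080-class chip: ~${implied_price(400, 0.80):.0f}")
```

Same silicon volume, wildly different dollars per unit, which is the whole point about consoles being steady but thin.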
 
Uh, sorry?

That is one of the standards Nvidia uses for certifying a VESA async monitor as “Gsync compatible”.

"The standard is set" means that G-Sync is the standard. Every Freesync implementation falls short of G-Sync. A few are close.
 
However, it should give AMD some pulling power in developing standards.

But haven't the standards been 'developed'?

Despite AMD powering the next Xbox, Microsoft worked with Nvidia for DXR. If anything, Nvidia is dictating standards to AMD through Microsoft.
 
AMD is in both the next Xbox & PlayStation; I'd say Nvidia is fucked. Sure, they'll sell $2,000 cards, but the console market is where the real money is.

The gaming GPU market is where the real money is. The money in the console market goes first to Sony and Microsoft, and then to game devs; the hardware IP providers get slim pickings. AMD produces nothing here, and unlike with their GPUs and CPUs, they're not in charge of console part production either.
 
"The standard is set" means that G-Sync is the standard. Every Freesync implementation falls short of G-Sync. A few are close.

Excluding Gsync Ultimate, the claim is dubious.


Especially with laptop displays being certified as Gsync displays. I’m very sure there are multiple Freesync monitors better than the “Gsync” Async panels in some laptops.

Do note those are Gsync, not “Gsync compatible”
 
Excluding Gsync Ultimate, the claim is dubious.

Every G-Sync monitor has every feature that the very best Freesync implementation has, and syncs from 30Hz to the panel max. Nothing dubious about it.

Especially with laptop displays being certified as Gsync displays. I’m very sure there are multiple Freesync monitors better than the “Gsync” Async panels in some laptops.

Do note those are Gsync, not “Gsync compatible”

We're all aware that Nvidia went a different route for laptops. Using an off-topic point undermines your argument, and you'd have to show that G-Sync as implemented in laptops is actually inferior to G-Sync as implemented in desktop GPUs and monitors.
 
AMD probably makes $10-20 in gross profit per console sold currently, depending on the console.

A 2060 should generate somewhere between $80 and $120 in gross profit by my estimates: a 60-75% margin.

A 2080? Probably $350-450 in gross profit: an 80%+ margin.

AMD’s profit per chip will rise with new consoles, but the margin will still be 15-20%.


The console business is good for AMD, and this generation saved AMD, but there is money to be made in both places.


Let me be as clear as possible: consoles mean steady income from selling one chip design over a 5-10 year life span. amd doesn't have to worry about making the greatest gpu in the world right now or maintaining feature parity with nv in features and software. why? it is console hardware that defines gaming and where devs allocate resources. amd saw an opportunity for steady income and a chance to define a generation of gaming, and they are taking full advantage of it. right now amd is in the driver's seat and will continue to be, regardless of jhh's squawking.
 
let me be as clear as possible: consoles mean steady income from selling one chip design over a 5-10 year life span. amd doesn't have to worry about making the greatest gpu in the world right now or maintaining feature parity with nv in features and software. why? it is console hardware that defines a generation of gaming and where devs allocate resources. right now amd is in the driver's seat and will continue to be so.

It's console hardware that influences the beginning of the generation; however, developers and publishers have realized that properly supporting PC gaming can be very rewarding and move past the limitations set by consoles quickly. I fully expect AMD to ship ray tracing hardware in the upcoming console generation, but even if they fail to do that, PC gaming won't be held back.
 
It's console hardware that influences the beginning of the generation; however, developers and publishers have realized that properly supporting PC gaming can be very rewarding and move past the limitations set by consoles quickly. I fully expect AMD to ship ray tracing hardware in the upcoming console generation, but even if they fail to do that, PC gaming won't be held back.

I fail to see where he is implying PC development will be held back.

I own all nVidia hardware, but to pretend that holding the consoles isn't a big win, or even a possible game changer, is silly. We are lucky that the GPU ecosystem is relatively open; if it were built on proprietary systems (like many other areas have been), nVidia would be at a disadvantage.
 
to pretend that holding the consoles isn't a big win, or even a possible game changer, is silly

But it really isn't. AMD is only supplying hardware IP. Microsoft and Sony are the big winners; Microsoft especially, given the commonality of architectures and that their API is similar to DX12.

So AMD gets a small stream of revenue, but they don't really gain any influence. Microsoft and Sony decide where the platforms go, as they are AMD's customers here.
 
With how underwhelming the 20xx cards are and how great the 1080 Ti is, why would you feel the need to upgrade? What game are you trying to play that you can't play maxed out? Is Crysis 4 out or something?
I have more than one PC. I usually employ a "trickle-down" effect, so I was hoping to move this 1080 Ti into my arcade PC (one of my arcade cabinets has a 4K display).
 
RTX and DLSS are tangible features with tangible effects, and more future-proof. You get DXR support. Vastly lower power consumption too.
"Tangible", right.
With one game and only "playable" at 1080p on the RTX 2080 Ti - the 2080 and 2070 are officially unplayable with RT enabled.

At least with 16GB of VRAM there are no worries about running out of said VRAM with HD or 3rd party high-res texture mods to existing titles and other software.
Not to mention it doesn't cost $1200+ with a 10-20% reliability lottery.
 
No, DXR is an industry standard now; the industry is moving in the direction of ray tracing. AMD is just late to the party, as usual.
Kind of like how DX10 was the new "industry standard" and PhysX was the new "industry standard".
This is turning into a repeat of 2007/2008 all over again. I did this dog and pony routine back then, and you know what they say: "Fool me once..."

DX10 was a joke and never worked worth a shit - it was later on that DX11 became the true successor and next industry standard that officially replaced DX9.0c.
Also, (GPU-based) PhysX was never a required feature, and was almost always tacked on after-the-fact, which is exactly what RT is starting to look and feel like - while it was nice to have, it was never worth the trouble and definitely not the additional costs (both monetary and performance) to have it enabled.

I'm not saying that RT won't become the official industry standard, as you stated, but it is going to take at least a few more years and a few hardware generations, if it is picked up at all in any serious manner.
Maybe in the next iteration of DX and RT, but for now, this is basically an extremely high-cost premium consumer demo of things to come - definitely not worth the $1200 it takes to pull it off, and just barely at that.
 
But it really isn't. AMD is only supplying hardware IP. Microsoft and Sony are the big winners; Microsoft especially, given the commonality of architectures and that their API is similar to DX12.

So AMD gets a small stream of revenue, but they don't really gain any influence. Microsoft and Sony decide where the platforms go, as they are AMD's customers here.

as i said before, it's a stream of revenue from a single design for 5-10 years. btw, this little development helped put amd back in the green.

yes, game devs are building console games for amd-based hardware using intel's cpu code path and nv's gpu rendering path. not. it's all amd's code and render path, my friend.
 
Every G-Sync monitor has every feature that the very best Freesync implementation has, and syncs from 30Hz to the panel max. Nothing dubious about it.
So a Freesync panel that met the 30-144Hz (or whatever) refresh range with a quality panel... would be identical, yes? And hence better, due to a lower price and lower power consumption.


We're all aware that Nvidia went a different route for laptops. Using an off-topic point undermines your argument, and you'd have to show that G-Sync as implemented in laptops is actually inferior to G-Sync as implemented in desktop GPUs and monitors.
How dense are you?
https://www.geforce.com/hardware/technology/g-sync

Nvidia has “Find online retailers offering NVIDIA G-SYNC monitors and laptops”.
According to Nvidia, VESA async is worthy of the Gsync brand.

So did AMD regress the standard? Not a single monitor fits the VESA standard?

Because what I see is two general ways your opinion could be valid regarding how Freesync stacks up to Gsync:
1. Without Nvidia no one can properly follow and implement the VESA standard.
2. No one wanted to make a solid Freesync monitor with high end monitor features. They just marked up prices and lied about features supported.

Sorry if it feels like I am putting words in your mouth. But Nvidia does not distinguish between Gsync with a module and Gsync with VESA adaptive sync, except when it comes to "freesync" panels. Almost like it is purely marketing FUD...

Oh, and Nvidia has "Gsync" panels in laptops that fail to meet their own standard for async desktop monitors that officially work with their GPUs.
 
But it really isn't. AMD is only supplying hardware IP. Microsoft and Sony are the big winners; Microsoft especially, given the commonality of architectures and that their API is similar to DX12.

So AMD gets a small stream of revenue, but they don't really gain any influence. Microsoft and Sony decide where the platforms go, as they are AMD's customers here.

It really is. On top of the other arguments made, it makes AMD the standard to which titles are designed, and the console market is huge compared to the PC.

Yes, the odd HairWorks title beats up their cards, but consoles have kept AMD in the game and given them large market share, with ongoing revenue on what quickly becomes legacy hardware.
 
It really is. On top of the other arguments made, it makes AMD the standard to which titles are designed, and the console market is huge compared to the PC.

But this does not make them 'the standard'; the standards are x86 and DX. Developing for consoles means developing for consoles.
 
But AMD will save us from Ngreedia! They care about their users more and always offer more value and innovation. Lol, nope: they took advantage of the high prices and offered literally no real performance-per-dollar increase over the 2080, or even the old 1080 Ti that many of us have been using for nearly two years. Think about how fucking stupid it is that all we can get from either company in 2019 is a sidegrade at best, for the same amount of money we paid at the beginning of 2017.
 
When you say "above" 2080 performance... are you just inventing this? I mean, the head of AMD said "below".

Interesting. I have followed all this, watched the presentation and can't find when Su said this. A link to the quote would be appreciated.
 
So, you are trolling, or are a paid shill for Nvidia, or who knows what. Got a grudge? Is what you're saying the script that Nvidia or some 3rd-party middle-man PR company have given you to try to mislead people with?

To be clear, the information that we have shows that Radeon 7 is ABOVE the performance of an RTX 2080.

And a 1070 Ti isn't what comes below the performance of an RTX 2080, so it's absolutely ridiculous to even be mentioning it at all - and you've been corrected about this multiple times.


Here's a ranking of comparable performances to make this easy to follow:

(least performance) 1070 ti < 1080 < 1080 Ti < 2080 < Radeon 7 (most performance)


The only consumer Nvidia GPU that offers more performance than Radeon 7 is the RTX 2080 Ti.




The only benchmark I'm aware of is this:

[attached benchmark chart]




Troll elsewhere, please.

AMD is doing some great things. I was very impressed by the fact that Lisa Su didn't snap back at nVidia after nVidia's CEO called their new card, the Radeon 7, "underwhelming." While I doubt I personally would ever have any interest in an AMD graphics card, I am very much looking forward to their new Zen 2s that are coming out shortly. I'm hoping this CPU truly beats Intel in all metrics, although I am concerned it's a multi-chip design; I do worry about interconnect I/O bottlenecks.

But I do want to share a link. It's important that we bring not only clarity to this subject but also a sense of truth and justice. It happens often, too often, that when people allow their emotions to run away, that emotion and sensitivity end up doing this community a disservice. I cannot sit here idle and allow a misrepresentation of the facts to occur. The last thing any of us wants is for some kid to rush out and blow nearly $800 (shipping and tax factored in) on a card that's not going to give him much more performance than a 1080 Ti. And these are not my words or numbers; I'm just repeating what I've read and heard, what's available on the Internet.

So Far, The Radeon 7 Beats The RTX 2080 (But Only In One Game)

Link:

https://www.kotaku.com.au/2019/01/amd-quietly-says-the-radeon-7-beats-the-rtx-2080-in-one-game/


In regards to you calling me a troll, it just doesn't work like that. You really should not invent things off the top of your head. I said I was playing Devil's Advocate; I clearly said this. I'm not sure of the intent here, or the tangent, but I do get it. This Radeon 7 is not the best news that we were all hoping for, especially the price. People are disappointed, hurt, sensitive, and a bit fragile at this news, and I get it. And that's OK. I'm fine with the display of emotions. We just all need to take a breath before calling someone a troll. My post doesn't even remotely meet that definition.

I propose this: get your $700 + $15 shipping + $70 tax, or nearly $800, together, buy this card, and show this community you support AMD. Not only that, but show that you stand behind your words. I think that would be a great outcome and, honestly, it would help me respect you a little bit more. A win-win for both of us.

Again, the Devil's Advocate in me is going to "gently" remind you and others that for about $300 to $500 you can nearly, if not possibly, beat the performance of this new card. I am pretty damn sure the 1080 Ti is going to be within 5 or so frames of the Radeon 7. And what's really cool, and this is a real option for potential buyers, is that the Radeon 7 is now considered a "dumb card" tech-wise by today's video card standards, in that the Radeon 7 doesn't have RT, A.I., DLSS, etc. So there is no loss going to a 1070, 1070 Ti, 1080, or 1080 Ti over a Radeon 7. Potential buyers do not have to sacrifice any tech between the two, and that's great news.
 
Again, the Devil's Advocate in me is going to "gently" remind you and others that for about $300 to $500 you can nearly, if not possibly, beat the performance of this new card. I am pretty damn sure the 1080 Ti is going to be within 5 or so frames of the Radeon 7. And what's really cool, and this is a real option for potential buyers, is that the Radeon 7 is now considered a "dumb card" tech-wise by today's video card standards, in that the Radeon 7 doesn't have RT, A.I., DLSS, etc. So there is no loss going to a 1070, 1070 Ti, 1080, or 1080 Ti over a Radeon 7. Potential buyers do not have to sacrifice any tech between the two, and that's great news.

Friendly reminder that I seem unable to find a new, in-stock 1080 Ti for under $1,000.
2080s are readily available at $699, though.
 
In regards to you calling me a troll, it just doesn't work like that. You really should not invent things off the top of your head. I said I was playing Devil's Advocate; I clearly said this. I'm not sure of the intent here, or the tangent, but I do get it. This Radeon 7 is not the best news that we were all hoping for, especially the price. People are disappointed, hurt, sensitive, and a bit fragile at this news, and I get it. And that's OK. I'm fine with the display of emotions. We just all need to take a breath before calling someone a troll. My post doesn't even remotely meet that definition.

I would say that whether any of your posts befit the definition of trolling depends on whether they were innocent mistakes. These are the parts that I took issue with:


"It really doesn't have anything new and the performance is still below that of a 1080 ... around the performance of a 1070 ti ... maybe?"

"So AMD says this new card is just below a 2080 ... that put's it at about a 1070 ti ... right?

Proof - https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-vs-Nvidia-GTX-1080-Ti/4026vs3918"

"If it's below a 2080 and the 2080 and the 1080 ti are give or take 5 or so frames apart then does that mean it's the performance of a 1070 ti?"

"The 7 stands for the 7 in 1070 Ti performance I heard. Not sure if that's true but makes sense."


Maybe you just didn't know the performance level of a GTX 1070 Ti and thought that it's close to an RTX 2080. But, you kept phrasing your comments as though you're trying to seed the idea to people that a Radeon 7 is closest to a GTX 1070 Ti in performance, when that's a baseless idea because Radeon 7 is close in performance to GPUs that are much faster than a GTX 1070 Ti. There are indications that Radeon 7 is on par with an RTX 2080, with an RTX 2080 being 3 Nvidia GPUs ahead of a GTX 1070 Ti.

You also posted a link to user benchmarks for RTX 2080 and GTX 1080 Ti as alleged "proof" that Radeon 7 will be around the performance of a GTX 1070 Ti, using phrasing that resembles weasel words:

For those of you that don't truly really know your numbers and there are a surprising amount of you on here, I witness it every single day and or you rely on 2nd hand hear-say ... the 1080 ti is actually 2% faster than the nVidia 2080. So AMD says this new card is just below a 2080 ... that put's it at about a 1070 ti ... right?

Proof - https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-vs-Nvidia-GTX-1080-Ti/4026vs3918


Regardless of your intention, it isn't an accurate suggestion and it was corrected. But, even after it was corrected in quotes to you, you posted it more times, ignoring the correction. So, there's that.


My pointing this out isn't about being a fan of AMD or a critic of Nvidia. I'm not planning to purchase a new GPU at this time and when I do it probably won't be for the obscene prices both AMD and Nvidia are asking for their latest generation of cards.
 
But it really isn't. AMD is only supplying hardware IP. Microsoft and Sony are the big winners; Microsoft especially, given the commonality of architectures and that their API is similar to DX12.

So AMD gets a small stream of revenue, but they don't really gain any influence. Microsoft and Sony decide where the platforms go, as they are AMD's customers here.

I'd say the key influence AMD gets is with their fab orders. Volume speaks volumes.
 