You don't get nothing. You get, apparently, 2080-level raster performance and 16GB of VRAM for $699. That's not nothing, particularly considering early benchmarks of the 2080 show that you "get ray tracing", but at a performance hit significant enough that most gamers either won't use it or will be limited in where they actually can. I don't consider ray tracing in this generation of Nvidia cards something that would make me want to go out and make a purchase, given how poorly it performs.

So it's bad for NVIDIA to launch 1080 Ti performance at $700 with DLSS and RTX, but it's good for AMD to do the same, only with 16GB of VRAM? Get off your hypocritical high horse.
LOL! Man, you sure do love your Nvidia. DLSS is of no value on the hardware that exists, unless you have no issue running your games at 1080p on a $1200 card. :D Now, the fact that some bought this card on a promise of what might be is on them. Me? I would rather have a card that can be fully utilized in all games now and not be limited in the future. Heck, the VII is just a refresh and improvement on what already exists, which makes me look forward to the new stuff that should be out later this year. (Either that, or you got jealous that Lisa's leather jacket was better.) :D

Edit: Right, games that add DXR as an afterthought will not run on VII cards because, reasons...
DLSS gives you free fps at minimal cost to IQ; it's of tremendous value. DXR gives you unprecedented IQ enhancements, never before seen in gaming. The Vega VII will be massively limited in the future, as it has no DXR whatsoever.
I bet you were one of those guys who bought a standalone PhysX card, weren't ya? :D
And you must be one of the geniuses who bought a DX9 card when DX11 was released.

This is DirectX 10 all over again: Nvidia was first to have it, but turn on DX10 shadows and the games ran twice as slow. Sound familiar? Then AMD released their DX10 cards, but theirs were DX10.1, and that gave a massive performance boost to games that utilized it, like Assassin's Creed, except that magically the support for DX10.1 was removed and never seen again.
Please don't stitch together a bunch of mumbo jumbo; this has nothing to do with DX10. DXR adds tangible IQ enhancements. DX10 didn't do squat.
This has to do with AMD being behind on everything: performance, power consumption, efficiency, features, and even price! To the point that they can't support a major branch of DX!
As for ray tracing, you don't really need DXR to do it. Hybrid ray tracing is not a new thing and could be done on DX11, as shown by this Japanese demo. Both the Unreal Engine guys and the Battlefield V team said they had ray tracing working without Nvidia's RTX cards. Makes one wonder if you just need a good CPU instead of wasting half a GPU to do ray tracing?
RTX accelerates ray tracing, genius; without it you can't even run ray tracing at 10fps in real time!

This Vega 7 joke has the marks of a trash Vega rehash all over it; even Jim from AdoredTV thinks it's an embarrassment to AMD.
 
 
Saying FreeSync has never worked is highly underhanded!

FS issues are mostly down to the number of panel manufacturers and the wide variance in quality between panels. A Samsung FS panel will perform differently than a Chinese/Monoprice knockoff panel. I have used both cheap and expensive FS panels and immediately noticed issues like ghosting and stuttering on the cheap ones. But to say it never worked is horseshit. Higher-end FS panels can be great and worth the cost.

I still think G-Sync performs better, but it comes at a premium that most mainstream buyers don't want to pay.

RTX accelerates ray tracing, genius; without it you can't even run ray tracing at 10fps in real time!

This Vega 7 joke has the marks of a trash Vega rehash all over it; even Jim from AdoredTV thinks it's an embarrassment to AMD.

That is hilarious. Jimmy is distancing himself from the Radeon 7 after his videos, backed by "leaks," said it was going to be so super awesome you don't even know!!!

Adored/Jim is a trash panda / gossip vlogger and nothing more. When he is right he behaves like Moses and when wrong he immediately does damage control.
 
RTX 2080 is $100 more with less performance. How exactly is it a better value again?

Because while the MSRP may be $100 less, until they are for sale we don't know for sure how the two cards will stack up against each other. And you say "less", but even AMD's own numbers basically have it being a wash, on average. Either way, "less" is only slightly less.

Depending on what you want, that $100 for RT and tensor stuff may well be worth the price difference. Not for me, personally, but it's not a hard viewpoint to understand. I was hoping for a bit more of a concrete counterpunch from AMD.
 
Jensen doesn't just trash AMD; he also trashes Intel and their graphics division. Jensen has issues and decisions to deal with, and it speaks volumes in his comments...

- Jensen is disappointed that NO other graphics developer is supporting ray tracing, and therefore there is no demand from game developers to push for it.

- Since a great number of monitors support FreeSync, Jensen is disappointed that he now has to support it too.

- Jensen doesn't like the idea that 4 major corporations in gaming are using AMD graphics: Microsoft with the Xbox One, Sony with the PS4, Intel with integrated graphics, and AMD itself with APUs & GPUs.

This reminds me of the HD-DVD vs Blu-ray war: the winner was the one with the most support from the studios, and Sony had huge influence in getting the studios on board. I don't think ray tracing is dead; it's just that nVidia is on its own in getting game developers on board, financing these companies with a falling stock.

It is disappointing to see/hear Jensen (a CEO) speak this way about any competitor.
 
It is disappointing to see/hear Jensen (a CEO) speak this way about any competitor.
He should just take a page from Apple and sue EVERYONE.

I don't care about RTX. I just want the most powerful card possible. When I go to the butcher, I don't ask him to cut me the 2nd or 3rd best.
 
- Jensen doesn't like the idea that 4 major corporations in gaming are using AMD graphics: Microsoft with the Xbox One, Sony with the PS4, Intel with integrated graphics, and AMD itself with APUs & GPUs.

This is a big problem for nVidia; the only thing that has saved them from going the way of HD-DVD, Betamax, and MiniDisc players is that the PC/GPU ecosystem is built on relatively open API standards. It doesn't matter if you have the best product when the marketplace is using a different ecosystem.
 
Jensen doesn't just trash AMD; he also trashes Intel and their graphics division. Jensen has issues and decisions to deal with, and it speaks volumes in his comments...

- Jensen is disappointed that NO other graphics developer is supporting ray tracing, and therefore there is no demand from game developers to push for it.

- Since a great number of monitors support FreeSync, Jensen is disappointed that he now has to support it too.

- Jensen doesn't like the idea that 4 major corporations in gaming are using AMD graphics: Microsoft with the Xbox One, Sony with the PS4, Intel with integrated graphics, and AMD itself with APUs & GPUs.

This reminds me of the HD-DVD vs Blu-ray war: the winner was the one with the most support from the studios, and Sony had huge influence in getting the studios on board. I don't think ray tracing is dead; it's just that nVidia is on its own in getting game developers on board, financing these companies with a falling stock.

It is disappointing to see/hear Jensen (a CEO) speak this way about any competitor.
Dr Su, is that you?
 
- Jensen is disappointed that NO other graphics developer is supporting ray tracing, and therefore there is no demand from game developers to push for it.

I think JH likes the fact that no other GPU devs are taking advantage of MS's DirectX Raytracing. Can't forget NV helped MS make DXR, so they had a jump on it. It allows NV to use their RTX brand as another reason to buy their cards. Look at the press after AMD announced the VII: almost every article had lines about the VII not having DXR. Then add JH's almost sociopath-like comments about the VII. It's all about marketing.

AMD probably had a good reason to skip DXR on the Vega 20.

DXR is capable of running on current hardware, according to the DirectX team. AMD could enable it on existing hardware just like NV, Sony, and MS could if they wanted; it's up to the individual hardware vendors to create their own backends for executing DXR commands. The problem comes with the performance hit the computations need. Consoles don't have close to the power (they are pretty much tapped out trying to get 4K at 60fps), and NV wants you to buy new cards.
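
Since "backends" may sound abstract: an application can ask at runtime whether a vendor has shipped any DXR backend at all. Here is a minimal C++/D3D12 sketch (assuming an already-created ID3D12Device; an illustration, not anyone's shipping code):

Code:
#include <d3d12.h>

// Query the raytracing tier the driver exposes.
// D3D12_RAYTRACING_TIER_NOT_SUPPORTED means the vendor shipped no DXR
// backend for this GPU; TIER_1_0 or higher means DXR commands will
// execute, whether hardware-accelerated or driver-emulated.
bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5))))
        return false; // runtime predates DXR entirely

    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}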

Thus newer generations will get DXR support, since their computing power will be greater or the GPUs will have specific cores set aside for it.

No DXR on the VII is most likely down to needing all the available performance to match the 2080 series and not wanting an "aesthetics" DX extension insanely slowing down the card and producing bad reviews. NV, even with cores specifically added for DXR/AI extensions, takes a massive hit, which led to bad press. Su is smart not to waste time on the Vega series trying to perfect a backend, given the likely performance hit.

Game devs have been moving to DXR since its release in October 2018. DXR is still super new, but it is now being used in Unity, Unreal, and two EA engines (SEED/FB). I don't know how many devs are willing to delay years' worth of work on almost-finished games to add DXR at the last minute; it will be added to games in early/mid-development. Look at the crap press EA/DICE got for trying to bolt it on at the last minute.

DXR will be slow for a while. That is also why I am personally staying away from the RTX series and keeping my 1080. No reason to move yet, though a lot of people have.
 
Eh, Nvidia needs competition; a $1k card is just ridiculous. Bitcoin made this shit all possible; too bad it has gone down in flames. Unfortunately, the same thing is going to burn AMD.
 
Dr Su, is that you?
Dude... Sony didn't have the most support. Toshiba and Sony were neck and neck. Toshiba bailed on HD-DVD when Sony literally bought Universal Studios away from HD-DVD. At that point it effectively crippled Toshiba's ability to remain competitive.

Sony paid billions for that support. Nvidia just threatens people in the industry with anti-competitive tactics, and now apparently insults as well.

For the record, I'm no AMD shill. I have a Ryzen 1700, but my other three systems have Intel processors. I own only Nvidia-branded video cards... because they're the fastest. I don't care who makes the fastest cards: if it's AMD I buy AMD, if it's Nvidia I buy theirs, and if it's Intel in the future I will buy their cards.

It's unfortunate that the head of Nvidia has to resort to unprofessional commentary in public, but I don't think Nvidia is in any way in trouble. Jensen is only pissed because he doesn't own the world at this point.

I hope AMD or Intel or some no-name brand comes out with something that gives him a run for his money, and I think AMD may be on the right track; time will tell. Until then, I'm going to sit pretty with my Nvidia cards... no matter how much the 2080 Ti I have pisses me off... lol

Edit.... Jim, sorry, I quoted the wrong person... Shit. ;)
 
AMD had to get a GPU out before Navi. It’s too far off yet.

The prices are what's really underwhelming here, and Red camp or Green camp, we should all be put off by that.

Nvidia pushed mid/high-end pricing up too far with RTX, new hotness or not. But hey, why not, since AMD has no answer.

Now that AMD finally has at least a stopgap card that is as fast as the 2080... the price is still a bit high, IMO.

I get that it has prosumer appeal; it's just an Instinct card, after all. I get that 16GB of HBM is expensive. I get that Nvidia sunk a bunch of R&D into RTX. So what? ~500 bucks should get us more than 1080/Ti-level performance from 2 years ago.

$599 would have been strong for the Vega II, current new pricing considered; less than that would have been a real disruption.
 
Can't say I disagree too much, at least from a consumer perspective. AMD's offering doesn't deliver significantly more performance per dollar than Nvidia's. It's good that they are competing at the higher end at all, though. Hopefully they can put out enough volume, and street prices end up notably cheaper than Nvidia's, so GPU prices can start becoming reasonable again. Otherwise, meh.

The next Ryzen CPUs sound promising though. I will be upgrading my 2700X to one of those.
 
"no AI" lol that's like saying it's not cloud connected and that's a deal breaker. Why do we need AI in our desktop gaming GPUs again?
 
And you must be one of the geniuses who bought a DX9 card when DX11 was released.

No, when DirectX 11 came out I think I was still rockin' my Radeon 8500LE that I got back in '02-'03... so how's that PhysX support treating ya?
 
I kinda feel bad for AMD, because the R7 would've been a huge win for people with FreeSync monitors, who had no high-end alternative, but then NVIDIA killed that by throwing support behind Adaptive-Sync.
 
AMD had to get a GPU out before Navi. It's too far off yet.

The prices are what's really underwhelming here, and Red camp or Green camp, we should all be put off by that.

Nvidia pushed mid/high-end pricing up too far with RTX, new hotness or not. But hey, why not, since AMD has no answer.

Now that AMD finally has at least a stopgap card that is as fast as the 2080... the price is still a bit high, IMO.

I get that it has prosumer appeal; it's just an Instinct card, after all. I get that 16GB of HBM is expensive. I get that Nvidia sunk a bunch of R&D into RTX. So what? ~500 bucks should get us more than 1080/Ti-level performance from 2 years ago.

$599 would have been strong for the Vega II, current new pricing considered; less than that would have been a real disruption.

My thoughts exactly - I want competition to drive down prices, and instead AMD me-too'd. There are of course many valid business reasons to do this, but it doesn't make GPU pricing sting any less for consumers.

To add to that, there have been a handful of sales for the 2080 recently around the $600 price point. My inclination says the majority of consumers would probably buy into DLSS and ray tracing over the 16GB of memory on the VII if the price points are similar. It isn't even a discussion if the VII turns out to be hard to get at MSRP and the 2080 is available and on sale.
 
FreeSync doesn't work??

Wtf are you smoking, it took me about 10 seconds to enable it.

My G-Sync was problematic as all hell: 8 FPS if you didn't play fullscreen, 8 FPS if you played borderless fullscreen, etc. etc.
I'm talking about streaming your games... not FreeSync.
 
It's funny... I call the RTX range "underwhelming" too, but I also add the word "overpriced" for free!
 
I'm a little confused: why is Jensen beating his chest about a technology that has only been used to make shiny floors in places where the floors could not possibly be shiny? This drives me nuts! I want more realism in graphics, not shiny for the sake of being shiny!

Consider the setting: war-torn Europe, yet someone took hours to buff the floor to a mirror finish? The cars and trolleys look like they were just detailed? Where the hell is the dust and the grime? Moronic!!! A total waste of resources. Show me something that uses RTX to increase realism and then I'll be impressed; otherwise, you can keep it. Oh yeah, and I'm not paying $800+ either. These prices are enough to make me think I might need to find a cheaper hobby. Maybe rebuilding classic cars, or perhaps making gold and diamond jewelry!
 
Time to sell my 1080 Ti. Won't support a D-bag like he is being. PLUS, consumers and stockholders see it too, which is why the stock is down 45%. Time to switch, right along with my Ryzen CPU. AMD, go go go!

There are a lot of D-bag CEOs that investors love. The reason NVDA stock is down is that the crypto bubble broke. Stock prices have nothing to do with the sentiments of guys like us on message boards. It's all about the revenues, the profits, and the future outlook.
 
So far I rate ray tracing as underwhelming, and given the current adoption of the 20xx series it will remain underwhelming for at least a generation.

I am glad nVidia brought it to the table, but it's a joke to stand up and beat your chest claiming it to be 'all that'.

I really hate this cringe-worthy alpha-male marketing BS (in general, not just here). I never understood the point of making unsubstantiated claims via puerile boasts. It completely saps your credibility and makes me want to ignore you.

Edit: above directed at Jensen, not the quoted person who I agree with.
 
I really hate this cringe-worthy alpha-male marketing BS (in general, not just here). I never understood the point of making unsubstantiated claims via puerile boasts. It completely saps your credibility and makes me want to ignore you.

Can't tell if you are addressing me as the poster, or Jensen as the marketer?
 
Time to sell my 1080 Ti. Won't support a D-bag like he is being. PLUS, consumers and stockholders see it too, which is why the stock is down 45%. Time to switch, right along with my Ryzen CPU. AMD, go go go!

I think you should show Nvidia and their "D-bag" that you really DGAF and are better than them by just giving your 1080 Ti away! Show them their products are worthless to you!

I would be happy to take it off your hands for free and make sure it never sees the light of day again!
 
I think you should show Nvidia and their "D-bag" that you really DGAF and are better than them by just giving your 1080 Ti away! Show them their products are worthless to you!

I would be happy to take it off your hands for free and make sure it never sees the light of day again!

Damn it, beat me to it. I don't like nVidia's BS, but they make the best, and the best is what I crave.
 
This puts a different spin on his waffling about FreeSync not working. From that article:

However, he does have data from his own labs that say there are problems from Nvidia’s point of view. The company tested 400 FreeSync panels and found only 12 that would turn on G-Sync automatically. The others require a manual override.

"We will test every single card against every single monitor against every single game, and if it doesn't work, we will say it doesn't work. And if it does, we will let it work."

“We believe that you have to test it to promise that it works,” Huang said. “And unsurprisingly most of them don’t work.”

So rather than 12 out of 400 working, his criterion for a fail seems to be that it has to enable the tech automatically? o_O
 
So it's bad for NVIDIA to launch 1080 Ti performance at $700 with DLSS and RTX, but it's good for AMD to do the same, only with 16GB of VRAM? Get off your hypocritical high horse.

WTF are you going on about? Where did I say any of that? Did you actually read my post, or did your view get blocked by your ultra-huge Nvidia fanboy boner?
 
So rather than 12 out of 400 working, his criterion for a fail seems to be that it has to enable the tech automatically? o_O

FreeSync has yet to be anything other than a shit-show. A few monitors offer near-G-Sync levels of implementation; most simply do not. None match G-Sync.

He may be technically incorrect, but he's not wrong about the superiority of his company's efforts in the field.
 
No, I am not trolling. I'm giving everyone else back their own words. None of this is me, my numbers, my anything. This is all information that I'm simply repeating.

When you say "above" 2080 performance ... are you just inventing this? I mean, the head of AMD said "below" but now it's .... "above?"

The 2080 and 2080 Ti are very, very close in performance. So how far below is the performance? If it's below a 2080, and the 2080 and the 1080 Ti are give or take 5 or so frames apart, then does that mean it's the performance of a 1070 Ti? These are statements and questions.

So, you are trolling, or are a paid shill for Nvidia, or who knows what. Got a grudge? Is what you're saying the script that Nvidia or some 3rd-party middle-man PR company have given you to try to mislead people with?

To be clear, the information that we have shows that Radeon 7 is ABOVE the performance of an RTX 2080.

And a 1070 Ti isn't what comes below the performance of an RTX 2080, so it's absolutely ridiculous to even be mentioning it at all - and you've been corrected about this multiple times.


Here's a ranking of comparable performances to make this easy to follow:

(least performance) 1070 Ti < 1080 < 1080 Ti < 2080 < Radeon 7 (most performance)


The only consumer Nvidia GPU that offers more performance than Radeon 7 is the RTX 2080 Ti.


I saw some numbers today, some benchmarks... some of those numbers are close to 1070 Ti numbers in some game reviews I saw, but no one is talking about that... not all, but some.

The only benchmark I'm aware of is this:

[attached benchmark image: amd-radeonvii-3.jpg]



This is not trolling... I'm just playing the role of Devil's Advocate. A lot of you dudes are getting caught up in hate, anger, and hysteria against nVidia... just general BS emotions that will cost all of you a lot of money and performance. Who cares what the CEO of nVidia had to say? When you're in your bedroom gaming, don't let one man's words cost you better performance or money. That would be craziness.

Troll elsewhere, please.
 
FreeSync has yet to be anything other than a shit-show. A few monitors offer near-G-Sync levels of implementation; most simply do not. None match G-Sync.

He may be technically incorrect, but he's not wrong about the superiority of his company's efforts in the field.

It's up to the tech press to put his claims to the test. I very much doubt that it was 12 out of 400 that passed; sounds like inflated BS to me.
 
FYI, TensorFlow works on GCN cards, even the 290X, etc.
Another thing you might find interesting: people have gotten ray tracing working on the prior-generation Titan V... the tensor core branding for "RTX" is just a marketing name; nothing much changed.
Tensor cores are in Volta. Turing seems to literally be Volta with GDDR6 and "ray tracing cores", which are clearly not nearly as important as Nvidia stated. So far, at least.

RTX 2080 is $100 more with less performance. How exactly is it a better value again?
Huh? I can find plenty of 2080 cards with decent coolers for $700-725 on PCPartPicker. Maybe I am wrong, but 700 minus 700 typically does not equal 100.

It's up to the tech press to put his claims to the test. I very much doubt that it was 12 out of 400 that passed; sounds like inflated BS to me.
Part of it is that they require a VRR range ratio of "2.4 or greater", so the max/min variable range has to be at least 2.4.

That rules out a lot of monitors. I'm willing to bet removing that rule, or even reducing it to 2.0, would result in a much higher pass rate. Have a window of 50-100Hz? That's a ratio of exactly 2.0, so it fails the qualification.

How many pass will, at the end of the day, depend on the ratio of cheap to higher-end monitors. Cheaper monitors with FreeSync will have issues, because they're cheap monitors.

I expect expensive monitors have a near-100% pass rate outside of the 2.4 ratio requirement.
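
To make the ratio arithmetic concrete, here is a minimal sketch of the "2.4 or greater" rule described above (a hypothetical helper for illustration, not Nvidia's actual validation code):

Code:
#include <cstdio>

// A panel "passes" the ratio rule if max_hz / min_hz >= required.
bool MeetsRatioRule(double min_hz, double max_hz, double required = 2.4)
{
    return (max_hz / min_hz) >= required;
}

int main()
{
    std::printf("48-144Hz: %s\n", MeetsRatioRule(48, 144) ? "pass" : "fail"); // ratio 3.0 -> pass
    std::printf("50-100Hz: %s\n", MeetsRatioRule(50, 100) ? "pass" : "fail"); // ratio 2.0 -> fail
    std::printf("40-75Hz:  %s\n", MeetsRatioRule(40, 75) ? "pass" : "fail");  // ratio ~1.9 -> fail
}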
 
It's up to the tech press to put his claims to the test. I very much doubt that it was 12 out of 400 that passed; sounds like inflated BS to me.

If your standard for passing is G-Sync, then none would pass. I think Nvidia is being generous.
 
I expect expensive monitors have a near-100% pass rate outside of the 2.4 ratio requirement.

Remember that the standard has been set by G-Sync, which has no ratio: it's simply 30Hz to whatever the max of the monitor is. The talk about ratios came about with the need to categorize the plethora of crappy "FreeSync" implementations.
 