NVIDIA changes GeForce RTX 2080 reviews date to September 19th

ir0nw0lf

Supreme [H]ardness
Joined
Feb 7, 2003
Messages
6,404
https://videocardz.com/newz/nvidia-changes-geforce-rtx-2080-reviews-date-to-september-19th

From Videocardz:

Just last week we reported that the NDA for GeForce RTX 2080 reviews ends on September 17th. According to our sources, that is no longer true.

NVIDIA GeForce RTX 2080 and RTX 2080 Ti on September 19th
Long story short, there is a problem. Most reviewers we spoke with are experiencing difficulties in securing their samples. Some are expecting them this week, some even today. However, that would mean they have less than a week to put together a full, professional review (if September 17th still applied).

NVIDIA has responded to this situation by pushing back the NDA date to September 19th, the same day GeForce RTX 2080 Ti reviews will go live. That means reviewers will publish their RTX 2080 and RTX 2080 Ti reviews on the same day.

Only a handful of reviewers have the cards already. Those who do still can't test them, because no drivers have been provided.

We will keep you updated should something change (again).

NVIDIA GeForce RTX NDA Dates

Previous NDA date | Product | Current NDA date
September 14th | Turing architecture | September 14th
September 17th | GeForce RTX 2080 reviews | September 19th
September 19th | GeForce RTX 2080 Ti reviews | September 19th
 
Let's hope it's for drivers, because a day before orders go out is kind of shitty.

Also hoping for some leaked benchmarks. Considering China has huge shipments of cards in, I expect that'll be soon.
 
They're saying some reviewers are getting cards without drivers even being available. I just watched an unboxing video for the EVGA 2080 so they're definitely going out.
 
Anyone know the release date for the Quadro RTX cards? I suppose it should match the gaming variant release date?
 
This seems completely unlikely.

They've already posted the 17th. How can they make sure every reviewer gets the memo that the NDA has been extended to the 19th? The only way they can be absolutely sure is to withhold the drivers.

How can any review site finish decent reviews in one day?

Not to mention it's more than suspicious that the first card, the 2080, is getting pushed to the 2080 Ti date. Seems more like they want the 2080's numbers lost in the hype of the 2080 Ti. Hope not, because for me the 2080 Ti price is a non-starter. The 2080 is as far as I can even dream of going on cost.
 
So much bullshit from the start with these cards. It's like the fire team is already getting in position for damage control on the regular 2080. Driver team is in overdrive mode. Fishy, real fishy....
 
Saw this on Gamer's Nexus just last night, but it's from Aug 29th. They also have a teardown video on their channel, so they've been in some folks' hands for at least 2 weeks now?


Looks like analysis of a picture from EVGA themselves.
 
Looks like analysis of a picture from EVGA themselves.
Yeah, that's what I am thinking. If they had actual hands-on with it, I would guess they would have taken multiple pics from various angles, back shots, etc. But that one pic looks to be enough for the guy to do his thing, which is pretty informative for those wanting to really geek out.
 
Saw this on Gamer's Nexus just last night, but it's from Aug 29th. They also have a teardown video on their channel, so they've been in some folks' hands for at least 2 weeks now?



Steve tore that card down at the convention last month (PAX East, West or some shit) at the EVGA booth. Looks like Buildzoid is just going off that.

 
Anyone know the release date for the Quadro RTX cards? I suppose it should match the gaming variant release date?

I spoke with our BOXX system rep the day the RTX Quadros were released, and he stated they're shooting for mid-October for these to be available for order through BOXX (I want to say he mentioned the 5000/6000 first, then the 8000 shortly after that, which might even be early November). Typically they've been extremely quick about getting new hardware into the hands of their buyers; the Titan V was available for order within a day or two of the consumer release.

Early/Mid October for consumers is my rough guesstimate :D
 
Looking forward to the reviews. Still hate the price, but I'm interested in seeing just how much the new tech stuff is going to change things. I'm especially interested in the DLSS stuff. If devs actually take advantage of it, it looks like it could be a huge performance bump.
 
Not sure how that's impressive when the price increase goes hand in hand with the performance increase.
 
Looking forward to the reviews. Still hate the price, but I'm interested in seeing just how much the new tech stuff is going to change things. I'm especially interested in the DLSS stuff. If devs actually take advantage of it, it looks like it could be a huge performance bump.

Yes, definitely. Will be interesting times moving forward. I can't wait to see what AMD can do with ray tracing, especially with all the compute power their cards have.
 
Not sure how that's impressive when the price increase goes hand in hand with the performance increase.

josephhooker.jpg
 


Some more results. BF1 and SotTR check out for me for a 1080 Ti.


I only have The Witcher 3, but I get ~60 FPS with those settings and a similar config, while they get 44? Also, that doesn't match some other results we've seen for the 1080 Ti...
 
Haha, this is going to be good.

It's funny to hear team green calling for driver support.
 
I never thought I would see people be so impressed with a 35-50% increase in performance for a 71% increase in price on the Founders Edition. Well, I guess when they make you beg for 2 years for new cards, anything is going to be impressive rofl! Nvidia played this round well!

Nvidia: "They are hungry for new cards, we will feed them and they will definitely pay more out of desperation!" lol
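(For reference, that 71% figure presumably compares Founders Edition pricing on the Ti cards: the GTX 1080 Ti launched at $699 and the 2080 Ti FE is $1,199, and 1199 / 699 ≈ 1.72, so roughly a 71-72% price increase against a rumored 35-50% performance gain.)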
 
I never thought I would see people be so impressed with a 35-50% increase in performance for a 71% increase in price on the Founders Edition. Well, I guess when they make you beg for 2 years for new cards, anything is going to be impressive rofl! Nvidia played this round well!

Nvidia: "They are hungry for new cards, we will feed them and they will definitely pay more out of desperation!" lol
If it was only a raw increase in performance then yes, it would be a pretty terrible deal.

I for one am buying the 2080 Ti mainly because it has ray tracing and the other AI stuff.
It is always a great experience to be on the forefront of technology, and a bad experience to watch all the reviews and screenshots and gameplay thinking 'I'll buy it in a year or two...' :oops:
It would make sense to wait if an AMD card with the same tech were scheduled for release soon... but nothing seems to be happening on AMD's side, so waiting is pointless.
10x0 cards didn't get much cheaper, so I do not suppose 20x0 cards will, at least not any time soon.

BTW, imagine developing AI stuff on this thing compared to anything else (let's not mention the Titan V, shall we? :dead:)

Yes, definitely. Will be interesting times moving forward. I can't wait to see what AMD can do with ray tracing, especially with all the compute power their cards have.
Ray tracing needs special circuitry to aid the computation of rays, and Tensor cores to do the dynamic de-noising.
As far as I am aware, no AMD card has these.

With all the compute stuff Vega has, it might be able to outperform the 1080 Ti if utilized properly, but for one I do not believe it will, and secondly it doesn't even matter because it won't be 'real-time ray tracing' anyway.
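Side note on what that "special circuitry" actually offloads: below is a minimal, hypothetical sketch (plain Python, nothing to do with NVIDIA's actual implementation) of the brute-force ray/sphere intersection work a ray tracer has to do for every ray against every object. RT cores essentially accelerate this intersection math plus BVH traversal so most objects can be skipped, and the de-noising pass that cleans up the sparse result is what Turing runs on the Tensor cores. The numbers and names here are purely illustrative.

```python
# Toy sketch of the per-ray work that dedicated RT hardware accelerates.
# Pure Python, no real renderer -- just the ray/sphere intersection math.
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance to the nearest hit, or None if the ray misses.

    Solves |origin + t*direction - center|^2 = radius^2 for t
    (a quadratic in t; direction is assumed normalized, so a = 1).
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2.0 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None                      # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0     # nearest of the two roots
    return t if t > 0.0 else None

def count_primary_hits(width, height, spheres):
    """Brute force: test every primary ray against every object.

    At 1920x1080 with thousands of objects and several rays per pixel this
    explodes into billions of tests per frame, which is why real ray tracers
    rely on BVH traversal (the part RT cores accelerate) to cull most objects
    and then denoise the noisy, low-sample result (the Tensor core part).
    """
    hits = 0
    for y in range(height):
        for x in range(width):
            d = (x / width - 0.5, y / height - 0.5, 1.0)
            norm = math.sqrt(sum(c * c for c in d))
            d = tuple(c / norm for c in d)
            for center, radius in spheres:
                if ray_sphere_hit((0.0, 0.0, 0.0), d, center, radius) is not None:
                    hits += 1
                    break
    return hits

if __name__ == "__main__":
    # One unit sphere five units in front of a 64x64 "camera".
    print(count_primary_hits(64, 64, [((0.0, 0.0, 5.0), 1.0)]))
```

The point is just that the naive cost scales with rays × objects every frame, which is exactly what the fixed-function hardware is there to cut down.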
 
If it was only a raw increase in performance then yes, it would be a pretty terrible deal.

I for one am buying the 2080 Ti mainly because it has ray tracing and the other AI stuff.
It is always a great experience to be on the forefront of technology, and a bad experience to watch all the reviews and screenshots and gameplay thinking 'I'll buy it in a year or two...' :oops:
It would make sense to wait if an AMD card with the same tech were scheduled for release soon... but nothing seems to be happening on AMD's side, so waiting is pointless.
10x0 cards didn't get much cheaper, so I do not suppose 20x0 cards will, at least not any time soon.

BTW, imagine developing AI stuff on this thing compared to anything else (let's not mention the Titan V, shall we? :dead:)


Ray tracing needs special circuitry to aid the computation of rays, and Tensor cores to do the dynamic de-noising.
As far as I am aware, no AMD card has these.

With all the compute stuff Vega has, it might be able to outperform the 1080 Ti if utilized properly, but for one I do not believe it will, and secondly it doesn't even matter because it won't be 'real-time ray tracing' anyway.

You're why marketing exists and why it succeeds. Also, just so you know, ray tracing has been done for years; Tensor cores are not needed to do it.
 
If it was only a raw increase in performance then yes, it would be a pretty terrible deal.

I for one am buying the 2080 Ti mainly because it has ray tracing and the other AI stuff.
It is always a great experience to be on the forefront of technology, and a bad experience to watch all the reviews and screenshots and gameplay thinking 'I'll buy it in a year or two...' :oops:
It would make sense to wait if an AMD card with the same tech were scheduled for release soon... but nothing seems to be happening on AMD's side, so waiting is pointless.
10x0 cards didn't get much cheaper, so I do not suppose 20x0 cards will, at least not any time soon.

BTW, imagine developing AI stuff on this thing compared to anything else (let's not mention the Titan V, shall we? :dead:)


Ray tracing needs special circuitry to aid the computation of rays, and Tensor cores to do the dynamic de-noising.
As far as I am aware, no AMD card has these.

With all the compute stuff Vega has, it might be able to outperform the 1080 Ti if utilized properly, but for one I do not believe it will, and secondly it doesn't even matter because it won't be 'real-time ray tracing' anyway.

Well, so basically what you said is that you are paying $1,200 for bleeding edge whether or not it gives you playable frame rates in ray tracing. I am not going to tell you how to spend your money, but ray tracing would be the last thing I pay that much money for on a first-gen product when it's going to require I drop my resolution to 1080p to get playable frames. I am never paying $1,200 for that lol. But hey, if you enjoy it, all power to you.
 
After seeing a few tech demos of ray tracing, resolution can go f*ck itself; I fully plan on paying through the nose to be able to turn on RT. But I'm going to wait till more games support it, so probably later next year.
 
After seeing a few tech demos of ray tracing, resolution can go f*ck itself; I fully plan on paying through the nose to be able to turn on RT. But I'm going to wait till more games support it, so probably later next year.
You can upscale using neural networks (DLSS), and because everyone will be complaining about RT performance, it is likely they will implement it.

I am not worried about resolution at all.
 
You're why marketing exists and why it succeeds. Also, just so you know, ray tracing has been done for years; Tensor cores are not needed to do it.

RT doesn't use Tensor cores, it uses RT cores, which are completely different. You're still correct that you don't need them, which is a good thing, but AMD or Intel would still need some kind of implementation. I think it's important to note they are separate, since being able to run DLSS and RT at the same time is ideal.

I am curious how well DLSS would do with RT though, since everything with RT gets skewed/warped on curved surfaces like it would in real life. Not as predictable... but I know some games are launching with both.

Well, so basically what you said is that you are paying $1,200 for bleeding edge whether or not it gives you playable frame rates in ray tracing. I am not going to tell you how to spend your money, but ray tracing would be the last thing I pay that much money for on a first-gen product when it's going to require I drop my resolution to 1080p to get playable frames. I am never paying $1,200 for that lol. But hey, if you enjoy it, all power to you.

There are at least two games I've heard running faster than that: BFV at 1440p at 40-50 fps (and DICE thought they could get at least 30% more out of it), and some war game running 4K at 90+ fps. I forget the name of it, but the video is floating around somewhere. I'd give them a chance...

3 days!

Also, this devblog is really well put together imo. It goes into the shader cores, RT cores, Tensor cores, etc. It has a link to a full RT paper which I haven't read yet. The CUDA cores did get a performance bump, which explains where the extra performance we see in "leaks" is coming from.
https://devblogs.nvidia.com/nvidia-turing-architecture-in-depth/
 
You can upscale using neural networks (DLSS), and because everyone will be complaining about RT performance, it is likely they will implement it.

I am not worried about resolution at all.
I'm not 100% worried about being able to use DLSS or not; it is another thing that has to be supported (driver-wise I believe, the developer has no requirements here).
 
RT doesn't use Tensor cores, it uses RT cores, which are completely different. You're still correct that you don't need them, which is a good thing, but AMD or Intel would still need some kind of implementation. I think it's important to note they are separate, since being able to run DLSS and RT at the same time is ideal.

I am curious how well DLSS would do with RT though, since everything with RT gets skewed/warped on curved surfaces like it would in real life. Not as predictable... but I know some games are launching with both.



There are at least two games I've heard running faster than that: BFV at 1440p at 40-50 fps (and DICE thought they could get at least 30% more out of it), and some war game running 4K at 90+ fps. I forget the name of it, but the video is floating around somewhere. I'd give them a chance...

3 days!

Also, this devblog is really well put together imo. It goes into the shader cores, RT cores, Tensor cores, etc. It has a link to a full RT paper which I haven't read yet. The CUDA cores did get a performance bump, which explains where the extra performance we see in "leaks" is coming from.
https://devblogs.nvidia.com/nvidia-turing-architecture-in-depth/

That's the whole point. By the time they learn to optimize this shit, next gen will be here lol.
 