To be fair to NVIDIA, the issue with ray tracing isn't the hardware, it's lazy-ass developers.
Lazy developers not prioritizing a feature for a micro percentage of the market? Nvidia obviously worked hand in hand with the developer on BFV. Other developers aren't throwing handfuls of money and time at something that is undeveloped and evolving when it doesn't add to their bottom line.
 
My first thought seeing this was... their performance looks like it's higher than your #2 card... so the 2080 is lousy, is what you're saying? The one you're selling at retail for around a grand-plus, right? That card?
 
The VII sounds like a great part to me. The only thing I wasn't in love with was the price. Fifty to a hundred bucks cheaper would have made it a killer de facto purchase for basically everyone but the most die-hard NV fans. Hopefully AMD realizes that and adjusts pricing quickly.

2080 performance at lower power usage sounds good to me. The tensor cores really aren't required at this point; I'm sure game developers will find more uses for them over the next handful of years. But if tensor cores don't find their way into AMD's lineup with Navi, then real-time ray tracing backed by tensor hardware is DOA. If it's not in the next PS5 / Xbox generation, it will always be an afterthought for developers. Frankly, I'm starting to doubt that the next-gen consoles could have fast enough tensor parts to really make local real-time ray tracing a reality.

IMO the only people likely to see real-time ray tracing will be the streamers. In fact, with all the push from game developers to move to streaming, I wouldn't be shocked if the industry as a whole is fine with only the most insane setups being capable of 60 fps real-time ray tracing at all. It makes the market for streamed games, with that and other eye candy cranked to 11, all the more attractive to the majority of the market. (What I'm saying is that it's a big selling feature for streaming if only 0.1% of PC gamers can even turn the feature on without dropping into slideshow mode.)

Although I would have loved to hear about five Navi cards, ten new Ryzen parts, new R9s, etc., AMD is being cautious; hopefully that pays off. The VII, if nothing else, gives them experience shipping 7nm parts. Hopefully it means they will be able to slot more performance into mid-range Navi parts mid-year.
 
Not the classiest of Q&A sessions from Jensen, but I'll put the politics aside (because who cares... just give me the tech) and wait for the "with DLSS" ray tracing benchmarks (from [H], of course) before taking a stance on the whole 20-series.
Kudos to Nvidia for cranking out new hardware before the games were ready for it. They probably knew they'd get shit on by the haters for that, but did it anyway. "Courage"? ;)

The pricing of the higher-end cards is high... but someone's gotta pay for the R&D so the next generation satisfies all the unreasonable competitive gamers who expect RTX's introduction to the world to run at the same speeds as without RTX.
And... if what Nvidia says about DLSS is true, maybe acceptable performance is not too far away?

What I really don't get is when people say they can't see the difference with ray tracing on versus off. It's pretty damn obvious, even partially implemented. I like eye candy and FPS. You can have both with these cards, but maybe not at the same time yet.


PS: I have an ASUS MG278Q monitor, which is on the supported list, so I'm not exactly disappointed with the CES news from Nvidia. Can't wait to check out FrankenSync with my 1070!
 
Was gonna say the same.

If you're number one, show some class and act like you've been there. His reaction speaks of fear, not confidence.

This has always been Nvidia's foundation, though. As soon as they were on the scene in the mid-'90s, their number one goal was to be the only player in town, by any means necessary.
 
I think the following sums it up nicely (from the review on the home page):

"As far as NVIDIA Ray Tracing goes in Battlefield V, it’s not ready for the masses, it’s not ready for mainstream gaming. The game looks fine without it, and for us, NVIDIA Ray Tracing adds nothing to the BFV multiplayer gameplay experience, except distractions and a reduction in performance."

Great review, guys!

As for leatherjacket and his take on the world, let's just say that right about now there are likely a whole lot of gamers out there who might buy AMD's "underwhelming" product purely out of spite for him and his company, noses and faces be damned.

[Attached image: nvidia weforce.png]
 
While Reddit can be a trash bin of sorts, I know there have been tons of RTX reports of "slowness" and "massive reductions in FPS" even at low resolutions and settings. Many disappointed folks out there. Maybe this is another Nvidia lemon series?
 
"And if we turn on ray tracing we'll crush it,"

Because gamers are lining up to play BFV at 20fps!

My highly overclocked 2070 runs BFV at 2560x1080 with everything on ultra, including ray tracing, just fine. In single-player, which is fine for me since I don't play anything else.

Don't forget DLSS... that still hasn't made an appearance in a game yet.

Well, it is in Final Fantasy XV, but only if you are gaming at 4K. DLSS was part of my decision-making process, though, and if Nvidia fcks this up and doesn't deliver, I'm going to be pissed. And currently I see nothing that makes it appear they are going to deliver.

Brags about DLSS... put up or shut up. Where are the games that support this shit, really? My 2070 is waiting.

No kidding. "DLSS will crush AMD!" For all five people that are still playing Final Fantasy XV in 4K.
 
When you have a truly stand-above-the-rest product, you tout its accomplishments and nothing more. It should be able to stand on its own merits. Let the benchmarks and unbiased reviews show your product's superiority.

When you have to bad-mouth a competitor's product, you're worried about what they have brought to the table.

Jimmies have been rustled :D and I'm 110% cool with that as a current member of team green with no clear upgrade path.

Good on AMD; can't wait to see how the VII truly stacks up against that RTX lineup in real gaming benchmarks. Paired with a solid higher-end FreeSync monitor, it very well could be a killer combo for those looking to finally step up both their monitor and GFX card game to compete with the higher-end Nvidia/G-Sync lineup.
 
Ray tracing is dead already. All the consoles are going to use AMD, so ray tracing will simply be a "but, but it has ray tracing" talking point. I'm pretty sure most of the dev studios couldn't give two fucks about it.
I believe ray tracing belongs on the CPU, or at the very least on an ASIC on a separate card that isn't occupying the same space as the GPU. Whatever Nvidia is doing, it isn't working. AMD, though, could have announced something about how they plan to handle ray tracing, and they didn't. Their new Ryzen doesn't even have finalized clock speeds, yet they have a name for their new $700 GPU. The whole presentation was underwhelming, except when she showed the actual next-gen Ryzen, which is going to be a chiplet design. That did give me a boner. It does make me wonder if AMD will do the same with their 7nm GPUs, since the idea is that smaller chips are less likely to have defects and will clock higher.
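To put a rough number on the "smaller chips are less likely to have defects" intuition, here's a minimal sketch using a textbook Poisson defect model; the defect density and die sizes are made-up illustration values, not anything AMD or TSMC have published.

```python
import math

def die_yield(die_area_mm2, defects_per_mm2):
    """Fraction of dies with zero defects under a simple Poisson model."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

d0 = 0.002  # hypothetical defects per mm^2, for illustration only

print(f"~330 mm^2 monolithic die: {die_yield(330, d0):.0%} yield")  # ~52%
print(f"~80 mm^2 chiplet:         {die_yield(80, d0):.0%} yield")   # ~85%
```

Same wafer, same defect density; the smaller die simply has less area for a defect to land on, which is the whole chiplet argument.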
 
Interesting.

Not having RTX? That is not a negative in my world. I would be interested to see how many people that own the new RTX series, percentage-wise, actually have it activated in BFV and other titles supporting it. I was planning on purchasing a 2080 Ti but took a big pass when I saw the price increase, and then, like a true [H]'er, I waited for benchmarks and was extremely disappointed in its performance per dollar. I ended up getting a placeholder RX 580 for the near term until 7nm (AMD) or 10nm (Intel) matures one or two generations. However, had I purchased the 2080 Ti, I don't think I would have been happy running it at 1080p with low, bouncing frame rates for a few visual effects.

Turn on DLSS? ... How? In what game? The FFXV benchmark is not playable as far as I know. This is a very interesting technology, but without widespread adoption (or any?), it does not have a place at the table as a stick to bash competitors with.

Trash-talking FreeSync as a technology is just comical; it is every bit as capable as G-Sync without the additional cost. I have always been a fan of open standards: they give every developer an even footing to improve the technology. I don't feel sorry for people who don't do their research before buying a serious piece of equipment. This gives me flashbacks of the GPP and its definitions of "quality," and yet we have GT 1030s listed as "affordable discrete gaming graphics cards"; what an oxymoron. A low-priced FreeSync monitor shouldn't be taken at face value either; anything marketed as both "inexpensive" and "gaming" needs to be thoroughly researched before you trust it to deliver any kind of gaming cred.

I think dismissing a juggernaut like Intel when it comes to discrete graphics is foolish. They should be taking them very seriously; their 10nm process will be highly efficient and every bit as good as a 7nm one. I'd also bet that their ability to reverse-engineer Nvidia's designs (and the already-familiar AMD ones) gives them a very good starting point.

I have owned both AMD and Nvidia; I don't find myself swayed by anything other than business practices. Currently, Nvidia hasn't done anything to earn my business, from ethics to price/performance.
 
He thinks the $350 2060 is a great deal, and I have to disagree with him. A $250 2060, that would have been a real deal. 6GB of VRAM on a $350 card is ridiculous and insulting. An 8GB 2060 at $300, that would have been awesome. $350 for an 8GB version, still good. But I'm sorry, 6GB really limits the lifespan of this card.

A 2060 at $250 might make AMD quit selling consumer GPUs altogether.
 
I'm still waiting on a mid-range announcement from AMD. The $699 price isn't going to work with my budget. I want something for $400 or less.

Yeah, that was one of my questions for AMD as well when Kyle asked if we had any questions we'd like asked. Hopefully they gave him an answer, but my guess is the only other 7nm card will be somewhere between the Vega 64 and the "Radeon 7." Then just lower the 64 and 56 to $350 and $250, with the RX 590 sitting at $200-225, and they'd pretty much have Nvidia covered. But that's a pipe dream that'll probably never happen with the state of prices right now.

Ray tracing is dead already. All the consoles are going to use AMD, so ray tracing will simply be a "but, but it has ray tracing" talking point. I'm pretty sure most of the dev studios couldn't give two fucks about it.

I don't think ray tracing is dead yet. I wouldn't be surprised if Navi ended up supporting some form of ray tracing, since it'll be a new architecture, but I think Nvidia jumped the gun with RTX without making sure there would be a suite of games supporting it at launch.
 
The VII sounds like a great part to me. The only thing I wasn't in love with was the price. Fifty to a hundred bucks cheaper would have made it a killer de facto purchase for basically everyone but the most die-hard NV fans. Hopefully AMD realizes that and adjusts pricing quickly.
AMD is probably grossing 100-150 bucks on the card.

Like, 400-450 dollars to make it, sold to AIBs for $550-600. My WAG.
 
I'm fairly confident that AMD has no plans to support ray tracing; their answer is neither yes nor no. Looking at it in its current state, it's a disaster: it's slow, it doesn't play smoothly in BFV because of DX12, and that's the only game at the moment.
The new consoles are going to use AMD, so it's just not looking favorable for NV right now. I'm not clouded by hatred or fanboyism for either camp; I use both and don't really care whether it fails or succeeds. Just calling it the way I see it, and right now it doesn't look good.
 
It sounds like JHH is a toolshed: how dare AMD beat him not only to 7nm, but also force him to concede and adopt FreeSync! And how dare AMD and devs throw shade on his beloved ray-tracing tech and DLSS that no one is going to use until the consoles support it on their AMD hardware in the 2020s.

From what I see, AMD has places to go with 7nm, while NV is pushing the limits of 12nm production. I really wonder what the yields and cost look like on an NV 12nm wafer making 2080s versus an AMD 7nm wafer full of Radeon VIIs.
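Nobody outside NV, AMD, and TSMC knows the real wafer prices or defect densities, but you can ballpark the shape of that comparison with the classic gross-die-per-wafer approximation plus the same simple Poisson yield idea as above. Every dollar figure and defect density below is a placeholder guess; only the rough die sizes (~545 mm^2 TU104, ~331 mm^2 Vega 20) are in the right neighborhood.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic gross-die-per-wafer approximation (ignores scribe lines, etc.)."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def die_yield(die_area_mm2, defects_per_mm2):
    """Simple Poisson defect model."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

# (die size mm^2, guessed $/wafer, guessed defects/mm^2)
candidates = {
    "TU104 / RTX 2080 (12nm)": (545, 6000, 0.001),
    "Vega 20 / Radeon VII (7nm)": (331, 10000, 0.003),
}

for name, (area, wafer_cost, d0) in candidates.items():
    good = dies_per_wafer(area) * die_yield(area, d0)
    print(f"{name}: ~{good:.0f} good dies/wafer, ~${wafer_cost / good:.0f} per good die")
```

With these made-up inputs the smaller 7nm die doesn't automatically come out cheaper, because the higher 7nm wafer cost and early defect density eat the area savings; plug in your own guesses and the conclusion flips around pretty easily.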
 
“Now we’re ready, and it’s called 2060,” Huang said. “[It has] twice the performance of a PlayStation 4 and it’s only $350.”

Fuckin' what? I think that's my new favorite jackass CEO quote.

 
Pretty crazy that some are still saying AMD has no competitive offer at that price point, because I'm pretty sure my Vega 56 flashed to a 64 is on par with the 2060 in both performance and price :D
 
Why does DLSS need to be trained, and why is it taking so long for Nvidia to train it for games? You'd think they could do it for every single game out there without help from the game's developers.
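For what it's worth, my understanding is that DLSS is a neural network that learns to reconstruct a sharp frame from a lower-resolution render, trained per game against extremely high-quality reference renders of that same game; that game-specific training data is presumably why the developer and Nvidia both have to be involved. A toy sketch of what "training an upscaler on one game's frames" means is below; the network, data, and everything else here is made up, not Nvidia's actual pipeline.

```python
# Toy 2x super-resolution training loop, standing in for the idea behind DLSS:
# learn a mapping from this game's low-res frames to high-quality reference frames.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyUpscaler(nn.Module):
    """Tiny stand-in for the real (much larger) DLSS model."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * 4, 3, padding=1),  # 3 channels * 2x2 upscale
            nn.PixelShuffle(2),                  # rearrange into a 2x larger image
        )

    def forward(self, x):
        return self.body(x)

model = TinyUpscaler()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Pretend these are paired crops captured from one specific game:
# low_res = what the GPU actually renders, target = heavily supersampled reference.
low_res = torch.rand(8, 3, 135, 240)
target = torch.rand(8, 3, 270, 480)

for step in range(100):
    opt.zero_grad()
    loss = F.l1_loss(model(low_res), target)
    loss.backward()
    opt.step()
```

Because the network is fitted to one game's content, every new title needs its own captured frames and its own training run on Nvidia's side, which would explain the slow drip of supported games.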
 
He thinks the $350 2060 is a great deal, and I have to disagree with him. A $250 2060, that would have been a real deal. 6GB of VRAM on a $350 card is ridiculous and insulting. An 8GB 2060 at $300, that would have been awesome. $350 for an 8GB version, still good. But I'm sorry, 6GB really limits the lifespan of this card.

It's a foreign concept to him that a person would rather call an Uber or risk driving themselves to a hospital than call an ambulance and eat a four-figure medical bill.
 
Pretty crazy that some are still saying AMD has no competitive offer at that price point, because I'm pretty sure my Vega 56 flashed to a 64 is on par with the 2060 in both performance and price :D

The only price point that matters for marketing purposes (and for me, since I'm at 4K) is the halo price point. Everyone makes broad assumptions about your entire lineup based on whether or not you are winning the halo wars.

So, to the typical consumer, unless AMD is beating the Titan RTX, nothing in their entire lineup is worth buying at any price.

That's just the way the stupidity of the market works. Everyone should be used to it now.

Have the best Titan-class product, or go home without selling even mobile budget GPUs.
 
The only price point that matters for marketing purposes (and for me, since I'm at 4K) is the halo price point. Everyone makes broad assumptions about your entire lineup based on whether or not you are winning the halo wars.

So, to the typical consumer, unless AMD is beating the Titan RTX, nothing in their entire lineup is worth buying at any price.

That's just the way the stupidity of the market works. Everyone should be used to it now.

Have the best Titan-class product, or go home without selling even mobile budget GPUs.
Dunno about the Titan, but it certainly wouldn't surprise me if that was true up to the 2080 Ti, or at least the 2080. That said, I doubt there are that many who purchase based on what the top-tier performance is – most will follow their friends' recommendations within their budget, which includes recommending three-generation-old cards because they're "just as good" (true to a point, but a flawed statement nonetheless).
 
Well, he does have a point. It really doesn't have anything new, and the performance is still below that of a 1080... around the performance of a 1070 Ti... maybe?

For those of you who don't truly know your numbers (and there are a surprising number of you on here; I witness it every single day) or who rely on second-hand hearsay: the 1080 Ti is actually 2% faster than the Nvidia 2080. So AMD says this new card is just below a 2080... that puts it at about a 1070 Ti... right?

Proof - https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-vs-Nvidia-GTX-1080-Ti/4026vs3918

Honestly, don't get caught up in ignorance, mob mentality, or hate... allow your money to be smart and unencumbered by emotion. I say this because I know for a fact many of you will drop $700 to $800 on this new AMD VII card out of hate or ignorance, and probably both, when you could have just bought a used 1070 for $200 or $250 or a 1080 Ti for $500 or so.

Like, seriously .... don't go into any of this AMD video card stuff with grand expectations. Temper that shit with reality and especially the past.

$700 is just crazy talk. You can get a used 1080 Ti for $500-$550... I see them all day, every day. And the 1080 Ti is a good performance clip ahead of the 1070 Ti.

Had AMD said it would perform at the level of a 2080 Ti... for $499 or $599... I would sell my RTX 2080 Ti in a heartbeat, lol.
 
“Now we’re ready, and it’s called 2060,” Huang said. “[It has] twice the performance of a PlayStation 4 and it’s only $350.”

Fuckin' what? I think that's my new favorite jackass CEO quote.

That comment of Jensen Huang's reads to me like a conniving sales pitch.

To compare the value of an RTX 2060 to a PS4, the RTX 2060's cost as a percentage of a complete PC tower's cost has to be compared to the PS4's GPU's cost as a percentage of the whole PS4. Or, the cost of an entire PC tower containing an RTX 2060, including keyboard and mouse, has to be compared to the cost of a PS4. The PC tower containing an RTX 2060 will probably cost around three times as much as the PS4.
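Rough numbers to make that comparison concrete; every price here is an illustrative guess, not a real build quote.

```python
# System-level "value" comparison for Huang's 2x-a-PS4 claim (all prices are guesses).
ps4_price = 300              # whole console
rtx_2060_price = 350         # the GPU alone
rest_of_pc_build = 700       # CPU, board, RAM, SSD, PSU, case, keyboard, mouse (guess)

pc_total = rtx_2060_price + rest_of_pc_build
gpu_perf_ratio = 2.0         # "twice the performance of a PlayStation 4", taken at face value

print(f"PC: ${pc_total} vs PS4: ${ps4_price} -> "
      f"~{pc_total / ps4_price:.1f}x the price for ~{gpu_perf_ratio:.0f}x the GPU performance")
```

Which is the point above: quoting the GPU's price against the price of a whole console only sounds impressive if you ignore the rest of the tower.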


Well, he does have a point. It really doesn't have anything new, and the performance is still below that of a 1080... around the performance of a 1070 Ti... maybe?

For those of you who don't truly know your numbers (and there are a surprising number of you on here; I witness it every single day) or who rely on second-hand hearsay: the 1080 Ti is actually 2% faster than the Nvidia 2080. So AMD says this new card is just below a 2080... that puts it at about a 1070 Ti... right?

I think your calculations are off:

A GTX 1070 Ti isn't just below the performance of an RTX 2080. A GTX 1070 Ti is just below the performance of a GTX 1080, and an RTX 2080 is 22% faster than a GTX 1080. So, just below an RTX 2080 would make AMD's new GPU significantly faster than a GTX 1080, with a GTX 1080 being 7% faster than a GTX 1070 Ti. An RTX 2080 is roughly 30% faster than a GTX 1070 Ti.

I think of "just below" an RTX 2080 to be maybe 2 - 8% slower, not 20%+ slower.
 
For those of you who don't truly know your numbers (and there are a surprising number of you on here; I witness it every single day) or who rely on second-hand hearsay: the 1080 Ti is actually 2% faster than the Nvidia 2080. So AMD says this new card is just below a 2080... that puts it at about a 1070 Ti... right?

Proof - https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-vs-Nvidia-GTX-1080-Ti/4026vs3918
Well, the RTX 2080 is actually faster, from what other sites are posting.
This synthetic benchmark (yes, I know, not real-world, but still a good general rule of thumb) shows things a bit differently:

https://www.videocardbenchmark.net/compare/GeForce-RTX-2080-vs-GeForce-GTX-1080-Ti/3989vs3699

GTX 1080 Ti - 14169
RTX 2080 - 15580

So that would make, on average, the RTX 2080 a little under 10% faster.
This is also just gaming/graphical/synthetic performance; in HPC and compute, the RTX 2080 is quite a bit faster.
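Quick check of that figure from the two scores quoted above:

```python
gtx_1080_ti, rtx_2080 = 14169, 15580  # synthetic scores quoted above
print(f"RTX 2080 is ~{(rtx_2080 / gtx_1080_ti - 1) * 100:.2f}% faster")  # ~9.96%
```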

Still, though, I totally get what you mean and do agree with you. :)
Also, the lack of additional VRAM on the RTX 2080 is a bit of a deal-breaker: 8GB on it vs. 11GB on the GTX 1080 Ti.
 
It is my opinion the 2060 should be no more than $299, with $249 preferred.

Performance at the same price point should go up every generation. What $250 bought me last generation should buy a certain percentage more performance in the next generation, at the same price.

I remember when an "xx60" card was targeted at the "sweet spot" or below - https://www.hardocp.com/article/2015/01/22/msi_geforce_gtx_960_gaming_video_card_review

https://www.hardocp.com/article/2012/09/12/asus_geforce_gtx_660_directcu_ii_video_card_review/1

https://www.hardocp.com/article/2016/07/19/nvidia_geforce_gtx_1060_founders_edition_review
 
Once again, everyone thinks of the now and not the later. I do think this card is underwhelming, and I think AMD knows it. They have a new architecture in the works, and we all know they've invested an obscene amount of money and time to go 7nm, even if TSMC is doing the grunt work.

So they get 7nm ready now and prove they can do it. Learn from it, so when the next gen is ready for production it won't be held back by the 7nm transition. Oh, and by the way, they get to make at least some of their money back now with this card.

It's a proof-of-concept part that they took a little further to make some money back.

Would be nice if nvidia pulled their heads out of their asses though. I think we'd all prefer a healthy market.

If you're thinking I'm an AMD fanboy, I'm currently running an Nvidia GTX 970 in my rig. I just buy whatever seems the best value for my needs at the time.


I'm right with you, brah... I'm only a fanboy of performance. I couldn't give a rat's ass whether my video card says AMD or Nvidia on it. I just go where the performance is. These kids, these companies, and everyone else can bitch, lie, gripe, and complain. They can take the drama with them, and I hope they do. I have no problem saving my money, saving up, buying the best, having the best. It means I don't have to deal with any of this petty crap.
 
Yes, to be fair, Nvidia doesn't have control over how game programmers make their stuff, but they're also asking people to invest a lot of money in what is essentially a leap of faith that the product they are buying today will ultimately deliver the features they want to use.
And now AMD asks you to spend a lot of money to get nothing: no ray tracing, no AI upscaling. Zero, nil. Who is the better value now?
 