Nvidia Killer

Chiplets are an intriguing idea. Theoretically you could move ROPs, memory controllers and cache off to a separate on-package die like AMD did with Zen 2. You would also need to move the "front end" that talks to the CPU. It'll be slower and use more power than a big fat chip though.
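To put a very rough number on the power side of that trade-off, here's a back-of-the-envelope sketch in C++. The bandwidth and energy-per-bit figures are illustrative assumptions (roughly on-die wires versus a Zen-style organic-package link), not measured values for any real product:

[CODE]
// Back-of-the-envelope: power cost of moving GPU traffic off-die.
// All numbers are illustrative assumptions, not specs for any product.
#include <cstdio>

int main() {
    const double bandwidth_TBps     = 1.0; // assumed traffic between shader die and ROP/memory die
    const double on_die_pj_per_bit  = 0.1; // assumed energy for on-die wires
    const double package_pj_per_bit = 2.0; // assumed energy for an organic-package link

    const double bits_per_second = bandwidth_TBps * 1e12 * 8.0;
    const double on_die_watts    = bits_per_second * on_die_pj_per_bit  * 1e-12;
    const double package_watts   = bits_per_second * package_pj_per_bit * 1e-12;

    std::printf("Moving %.1f TB/s on-die:           ~%.1f W\n", bandwidth_TBps, on_die_watts);
    std::printf("Moving %.1f TB/s over the package: ~%.1f W\n", bandwidth_TBps, package_watts);
    return 0;
}
[/CODE]

Even with generous assumptions, the same traffic costs an order of magnitude more power once it leaves the die, which is why splitting the ROP/memory side from the shader engines is a harder problem on a GPU than splitting CCXs was on Zen 2.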

Why should small CPU dies with single-digit performance increases cost more than GPUs 3-4x the size that substantially improve performance each generation?

Chiplet designs are still in their early days. They don't have to be slower at all; I think AMD has more or less proven that with Zen 2, which is faster generation over generation by a margin we haven't seen in a long, long time.

In GPUs, yes, the challenges are different. Of course both AMD and Intel are planning chiplet versions of their GPUs, because they will be slapping them on their CPUs. Intel is also potentially using its 3D Foveros chiplet-stacking tech, which, if they pull it off, is for sure better than what AMD is currently cooking and really would be no different than one big hunk of silicon. The chiplets would be stacked and connected with traces that would actually be shorter than the ones currently connecting one CCX to another.

As far as multi-chiplet GPUs go... I agree that is a different ball of wax, but I think all the players are looking at ways to make it happen. NV has already built chiplet research chips for AI work that share massive amounts of data. AMD is no doubt at least going to have a Navi chiplet to slap on the console parts; presumably their controller chip is going to have to address the GPU as well. Intel is a bit of an unknown, but based on the little they have released, I think it's a safe bet to assume they have some form of chiplet as well. Their integrated situation will greatly improve if they really do have a next gen using a 3D-stacked CPU and Xe graphics. I suspect their cards are also going to be chiplet-based somehow; perhaps it will be as simple as higher-end cards having an extra piece intended to be used for things like RTX.

As for the cost of GPUs... I don't know, you're right, a lot of people seem OK with spending north of a grand on a GPU. I guess I'm old school, but I remember when I could actually afford a 3dfx card, and a second one for SLI didn't cost more than a server chip. We are literally at a point where you can pick up an Epyc 7352 (7nm, 24-core/48-thread) server processor for less than the cost of the highest-end consumer GPU. I consider that insanity myself. It basically means I could build a solid 24-core, 128GB mid-range server... for the cost of a decent gaming PC with a 2080 Ti in it. Seems stupid to me, but who am I to judge people willing to spend that money. Then again, I do also laugh at yuppies (do we still call 'em that?) driving around in their expensive little German cars and, more often, their stupidly oversized and overpriced minivans (SUVs, I think the kids call 'em). :)
 
AMD, out of necessity, has had to get the best of both worlds out of their one architecture.
And Nvidia does it just for the lolz? They do it to consolidate resources and save money. Since the vast majority of those 10+ billion transistors have the same function, making lots of dies with slight differences doesn't make sense, for AMD or Nvidia.
 
Why should small CPU dies with single-digit performance increases cost more than GPUs 3-4x the size that substantially improve performance each generation?
Yeah, and add to that the BOM for fans, heatsinks, and 8-16GB of GDDR6. A 7700K (Intel doesn't publish numbers, so it's hard to find) is around 3 billion transistors; a 2080 Ti is around 18 billion. Just imagine not only the difference in cost from that size, but also how much more likely a wasted die becomes at that size. And again, you're talking about a full board, not just the die. I'm actually surprised they sold for so cheap for so long. Heck, when AMD decided HBM was their best option, they were paying more for that than most CPUs cost to produce.
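To put the "wasted die" point in perspective, here is a rough yield sketch in C++. The Poisson yield model is standard, but the die areas and defect density below are ballpark assumptions on my part, and it ignores that big GPUs can salvage partially defective dies by fusing off units:

[CODE]
// Rough yield comparison for a small CPU die vs. a big GPU die.
// Poisson yield model Y = exp(-D * A); the defect density and die areas
// below are illustrative assumptions, not published figures.
#include <cmath>
#include <cstdio>

double poisson_yield(double area_mm2, double defects_per_cm2) {
    const double area_cm2 = area_mm2 / 100.0;
    return std::exp(-defects_per_cm2 * area_cm2);
}

int main() {
    const double defect_density = 0.2;   // assumed defects per cm^2 on a mature node
    const double cpu_die_mm2    = 126.0; // ballpark quad-core CPU die (7700K class)
    const double gpu_die_mm2    = 754.0; // ballpark big GPU die (2080 Ti / TU102 class)

    std::printf("CPU die (%.0f mm^2): ~%.0f%% of dies come out defect-free\n",
                cpu_die_mm2, 100.0 * poisson_yield(cpu_die_mm2, defect_density));
    std::printf("GPU die (%.0f mm^2): ~%.0f%% of dies come out defect-free\n",
                gpu_die_mm2, 100.0 * poisson_yield(gpu_die_mm2, defect_density));
    return 0;
}
[/CODE]

With those assumed numbers, roughly three quarters of the small CPU dies come out clean versus about one in five of the big GPU dies, which is exactly the math that makes giant monolithic GPUs so expensive (and chiplets so tempting).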
 
Having a standard chiplet for encoding/decoding, I/O, the memory/cache controller, display controller, buffers, and fabric, plus maybe a standardized ray-tracing accelerator that works consistently across a broad range of cards, could bring down costs and leave the actual GPU die with just the graphics engine, varying the number of shaders depending on the SKU. The memory controller could support GDDR6, HBMx, etc. Having consistent ray tracing between the cards for lighting would bring dynamic ray tracing to gamers and give developers a consistent target. I don't think RTX is that useful yet; what would a developer shoot for, 2060-level RTX ability? 2080? 2080 Ti?

Games normally have a hell of a lot of ray tracing baked in: lightmaps and shadow maps generated from ray tracing. Months, if not years, of heavy ray-traced rendering goes into making games look really good. RTX does not even come close to duplicating that quality and really can't. Its advantage is that it is real time and dynamic, and can add another quality layer, such as reflections, GI, etc., to the game.
 
Vulkan is going to "clone" DXR features:
https://www.khronos.org/assets/uplo...-siggraph/Vulkan-01-Update-SIGGRAPH-Jul19.pdf

So DirectX and Vulkan will have similar features.
Game engines are implementing raytracing.
Developers want raytracing.

If you do not want raytracing...you are SOL...enjoy PONG...the gaming world is moving on.

(And yes, we know "DXR doesn't matter"...until AMD supports it in hardware :rolleyes:)
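For what it's worth, this is just Vulkan's normal extension mechanism at work. Here's a minimal sketch (my own, not from the slides) of how an app asks the driver whether a ray-tracing extension is exposed; it assumes the Vulkan SDK/loader is installed, and it checks for the vendor VK_NV_ray_tracing extension shipping in 2019, since the cross-vendor version was still being standardized:

[CODE]
// Minimal sketch: does this Vulkan driver expose a ray-tracing extension?
// Build against the Vulkan SDK and link vulkan-1.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <cstring>
#include <vector>

int main() {
    VkInstanceCreateInfo ici{};
    ici.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;

    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) {
        std::printf("No Vulkan loader/driver available.\n");
        return 1;
    }

    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    std::vector<VkPhysicalDevice> gpus(gpuCount);
    vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);

        // List the device extensions and look for the ray-tracing one.
        uint32_t extCount = 0;
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, nullptr);
        std::vector<VkExtensionProperties> exts(extCount);
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, exts.data());

        bool hasRT = false;
        for (const VkExtensionProperties& e : exts)
            if (std::strcmp(e.extensionName, "VK_NV_ray_tracing") == 0)
                hasRT = true;

        std::printf("%s: ray-tracing extension %s\n",
                    props.deviceName, hasRT ? "present" : "absent");
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}
[/CODE]

When the cross-vendor extension lands, the check stays the same; only the extension name string changes.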
 
This is the case for every single large GPU AMD has ever released (meaning, since they bought ATI). Every single one has been compute-heavy and targeted commercial applications first. And every single one was worse for gaming, relative to its die size and power draw, than a comparable Nvidia part. Since the release of the GTX 680, Nvidia has built dedicated high-end gaming GPUs that have smoked AMD's compute-heavy large GPUs. They still do today.

Big Navi will literally be AMD's first. Ever.

Ummm, 5870 and GTX 480? The GTX 480 was more compute focused; the 5870 not so much... (the 5870 was so much more power efficient it wasn't even funny).
Oh, and Volta was released as a Titan V for consumers...
The "efficient" archs are the ones that aren't compute focused. However, compute archs are efficient for compute tasks.
Also, AMD's "big chips" are always smaller than Nvidia's "big chips," if that makes any difference. (Volta was massive!)

Both AMD and Nvidia have done this before; nothing new.
Also, Nvidia has the resources to build both a compute GPU and a gaming GPU; AMD doesn't, and had to compromise with GCN. GCN has some legs, though; the same can't be said for Kepler...
 
Vulkan is going to "clone" DXR features:
https://www.khronos.org/assets/uplo...-siggraph/Vulkan-01-Update-SIGGRAPH-Jul19.pdf

So DirectX and Vulkan will have similar features.
Game engines are implementing raytracing.
Developers want raytracing.

If you do not want raytracing...you are SOL...enjoy PONG...the gaming world is moving on.

(And yes, we know "DXR doesn't matter"...until AMD supports it in hardware :rolleyes:)
You do know DX12 was copied from Vulkan, right? AMD had a large part in its design while working together with Microsoft. Also, Vulkan already has DXR-style extensions, which is where they are taking this from while trying to maintain compatibility (otherwise the hardware wouldn't work)... Not sure what you're getting at with this. So Khronos is adding a feature, great... what does that prove? It doesn't magically make the (few) games available any better. It doesn't make a $1,200 GPU any more enticing to me. Thanks for the update that an extension is moving to a supported feature; this is how OpenGL (Khronos), and now Vulkan, have done things since, oh, I don't know... the early 1990s. Great that you're catching up with this technology and concept; next we can introduce you to Windows 95.
 
Just drive my 1440x2560 @ 144Hz with FreeSync, 10-bit color, and HDR, over HDMI 2.1 or DP 2.0, with quiet cooling, low TDP, no longer than ~8-10", and a price somewhere under $300.

Really. You'll kill Nvidia if you do that.
 
It is code-named "NV Killer," not "NV Matcher." The 5700 XT is being held back by its memory speed (artificially, it seems) and is knocking on the RTX 2070S/2080's door. So where would that place the 5800 XT? And where would that place the "NV Killer" 5900 XT?
If it comes out six months from now, that will be a full two years after the RTX 2080 Ti. I don't understand why it would be so unfathomable to some that a brand-new gaming-oriented architecture could best the current top bar. In fact, it should be expected, or we are going backwards.
5900XT? I had one of those.

Ummm, 5870 and GTX 480? The GTX 480 was more compute focused; the 5870 not so much... (the 5870 was so much more power efficient it wasn't even funny).
Oh, and Volta was released as a Titan V for consumers...
The "efficient" archs are the ones that aren't compute focused. However, compute archs are efficient for compute tasks.
Also, AMD's "big chips" are always smaller than Nvidia's "big chips," if that makes any difference. (Volta was massive!)

Both AMD and Nvidia have done this before; nothing new.
Also, Nvidia has the resources to build both a compute GPU and a gaming GPU; AMD doesn't, and had to compromise with GCN. GCN has some legs, though; the same can't be said for Kepler...
So? The 8800 GTX was "compute focused" and it was top-dog in the gaming market for nearly 4 years.
Just drive my 1440x2560 @ 144Hz with FreeSync, 10-bit color, and HDR, over HDMI 2.1 or DP 2.0, with quiet cooling, low TDP, no longer than ~8-10", and a price somewhere under $300.

Really. You'll kill Nvidia if you do that.
See, this is one of the biggest issues in the market right now. People want AMD to crush NVIDIA, but they want AMD to do it at an unrealistic price point. Remember just recently how outraged people were at AMD's initial pricing on the 5700 and 5700 XT? You can't severely undercut the competition when you're already the underdog in the race.
 
Vulkan is going to "clone" DXR features:
https://www.khronos.org/assets/uplo...-siggraph/Vulkan-01-Update-SIGGRAPH-Jul19.pdf

So DirectX and Vulkan will have similar features.
Game engines are implementing raytracing.
Developers want raytracing.

If you do not want raytracing...you are SOL...enjoy PONG...the gaming world is moving on.

(And yes, we know "DXR doesn't matter"...until AMD supports it in hardware :rolleyes:)


I don't think you get it.

EVERYONE... every Gamer would like to see ray tracing in the future. They just are not going to spend $800+ to take a massive performance hit from broken "RTX On" technology. Those are the two points people are making, Factum, that you keep skirting around: ray tracing in games, and hardware that can deliver ray tracing.

Nvidia's "RTX On" is broken and will never get better in ray tracing games. No matter how many games come out next year, or the following, Nvidia current implementation will still be laughed at as a gimmick because Jensen tried to push broken RTX On onto the public (w/a strait face), as a selling feature.

I'll say that again, Jensen actually tried to sell a broken feature, as a positive for Turing.



Additionally, Navi 10 can already do ray tracing, but it doesn't work smoothly, so AMD has no reason to lie to themselves and try selling it as a "feature." They said they will offer ray tracing when it doesn't impact the end user's gaming performance.

Xbox, PlayStation and AMD have all said they plan on supporting ray tracing... we can only hope that Nvidia decides to support it too, someday, instead of trying to pass off their ad-hoc gimmick. And... it is a gimmick, because Turing can't do real-time environmental ray tracing in games; it can only do partial rays, carefully placed by Nvidia's teams.




As a Gamer I am not paying for ray tracing... just like, as a Gamer, I won't pay for shadows. There is no premium to be paid for ad-hoc.
 
I don't think you get it.

EVERYONE... every Gamer would like to see ray tracing in the future. They just are not going to spend $800+ to take a massive performance hit from broken "RTX On" technology. Those are the two points people are making, Factum, that you keep skirting around: ray tracing in games, and hardware that can deliver ray tracing.

Nvidia's "RTX On" is broken and will never get better in ray tracing games. No matter how many games come out next year, or the following, Nvidia current implementation will still be laughed at as a gimmick because Jensen tried to push broken RTX On onto the public (w/a strait face), as a selling feature.

I'll say that again, Jensen actually tried to sell a broken feature, as a positive for Turing.



Additionally, Navi 10 can already do ray tracing, but it doesn't work smoothly, so AMD has no reason to lie to themselves and try selling it as a "feature." They said they will offer ray tracing when it doesn't impact the end user's gaming performance.

Xbox, PlayStation and AMD have all said they plan on supporting ray tracing... we can only hope that Nvidia decides to support it too, someday, instead of trying to pass off their ad-hoc gimmick. And... it is a gimmick, because Turing can't do real-time environmental ray tracing in games; it can only do partial rays, carefully placed by Nvidia's teams.




As a Gamer I am not paying for ray tracing... just like, as a Gamer, I won't pay for shadows. There is no premium to be paid for ad-hoc.
Issue number 2 with the video card market revealed: people expect shiny new graphics technology to come with zero performance cost.
 
I can't believe people are still talking about ray tracing. Any meaningful implementation is still many years out. Best case, next-gen consoles will support ray tracing about as well as the Xbox One and PS4 supported 4K at launch. We will be lucky if next gen is even capable of good 4K performance, let alone a couple of ray-traced puddle gimmicks in two games.
 
Issue number 2 with the video card market revealed: people expect shiny new graphics technology to come with zero performance cost.
No, I leave that to people with money to burn. I'll buy it when it's more mature, or if it's a feature *I* can't live without. If you can't live without it, then get it; if you can, then don't. It's really that simple. I don't know why everyone argues so much; people have different opinions and ideas of what they want. I don't judge you for having yours, so stop judging me for mine. The industry will continue to move forward, and I will keep buying mostly mid/low end until something comes along that I feel is worth spending more on. I am not expecting 2080 Ti performance for $300, but I am expecting more performance than what I paid for last time I upgraded.
 
BTW how can anyone recommend Navi10 when it will be obsolete next year when even PS5/Scarlett will support RT?
 
I think he meant money, but thanks for giving him a way out. :D:D
Meh, it's easy to see it either way, I guess, but when I think of paying the price for a feature that's already physically there, I tend to think of the price you pay to use it, not the cost associated with producing/purchasing it, since that's already a done deal ;)
 
BTW how can anyone recommend Navi10 when it will be obsolete next year when even PS5/Scarlett will support RT?

If you can get one of those deals on the regular 5700 that prices it close to a 1660Ti, then it's a significant win over the 1660Ti.
 
BTW how can anyone recommend Navi10 when it will be obsolete next year when even PS5/Scarlett will support RT?
I mean... it's easy, watch: I recommend the 5700... Bam, done. Seriously though, do you know anyone running around with a 2060 gloating about its RT prowess? Yeah, me either. So what good is it versus a same-priced, faster, lower-power 5700, just because the 5700 is missing a feature that can't really be used? But... but... Nvidia logic. So, pretty easy to recommend. It would be much harder to recommend a card with 2080 Ti performance but without RT, especially at a similar price point, but that's not the case here. I could see the XT maybe being a bit harder to call, but the 5700 is an easy choice ;). As for the 2070 Super, I'm still not too sure how well it will perform in future titles, so comparing it to something without RT still leaves things pretty open, although not as cut and dried.
 
Shouldn't AMD aim for having RT on the next-gen 5700? Why would AMD have any card that has fewer capabilities than a PS5/Scarlett in the sub-$400 market?
 
Shouldn't AMD aim for having RT on the next-gen 5700? Why would AMD have any card that has fewer capabilities than a PS5/Scarlett in the sub-$400 market?

Oh for sure. We don’t exactly know what Sony and MS will do but if RT is a marquee feature on 2020 consoles AMD will be right there shouting about their support too. Navi will be long forgotten by then.
 
Shouldn't AMD aim for having RT on the next-gen 5700? Why would AMD have any card that has fewer capabilities than a PS5/Scarlett in the sub-$400 market?
Next gen? Possibly; we really need to see it put to good use to know, though. If it still sucks next year at that performance tier... then why would/should they? If games start using it and seeing a benefit that's worth the performance cost, then absolutely. Why would they have fewer capabilities? For performance reasons? Cost? If it's a sub-$400 card, chances are that if they included DXR (RT) you couldn't use it anyway... hence Nvidia drawing a line and releasing the 1660 series, etc., because they know there is almost zero chance you could have turned on RT and used those cards to reasonably play anything. A lot of people bought the 2080 Ti because it's the fastest card on the market for playing at high fps: some turn on RTX, others leave it off for the frame rate. I can't give an honest number, but based on reviews and comments I would say most leave it off most of the time (not counting the fact that in most games you can't turn it on at all). So you have a small percentage who use the feature, while others would have been just as happy without it (or without the associated cost). Priced right, there's still a market, and one that's not covered by Nvidia.
 
I believe this thread is actually The Shimmer. When I steel myself to peek in and see what's going on, I know how lil Natalie felt.

We sent in good people, but haven't heard back. But it looks okay...

Then screaming manbearpigs, RTX something something lighthouse, the laws of physics do not seem to apply here, Timmy has snakes in his tummy, and ERMAGERD ITS A CROCODILE.

Followed by the feeling I'd Like To Go Home Now.
 
I believe this thread is actually The Shimmer. When I steel myself to peek in and see what's going on, I know how lil Natalie felt.

We sent in good people, but haven't heard back. But it looks okay...

Then screaming manbearpigs, RTX something something lighthouse, the laws of physics do not seem to apply here, Timmy has snakes in his tummy, and ERMAGERD ITS A CROCODILE.

Followed by the feeling I'd Like To Go Home Now.

You confuse DXR with RTX?
 
This was news to me... this game will require DXR support to run:
https://techgage.com/news/stay-in-the-light-a-game-that-requires-nvidia-rtx-to-play/

And devs seem to like DXR; more and more games are supporting or demonstrating support for it (a minimal sketch of the capability check involved follows the list):

-Ark: Survival Evolved
-Assetto Corsa Competizione
-Atomic Heart
-Battlefield V
-Bright Memory
-Control
-Cyberpunk 2077 (this one has me all giddy)
-Call Of Duty Modern Warfare
-Dauntless
-Doom Eternal
-Dragon Hound
-Enlisted
-Grimmstar
-In The Black
-Justice
-MechWarrior 5
-Metro Exodus
-Sword and Fairy 7
-Vampires Bloodlines 2
-Watch Dogs Legion
-Wolfenstein Young Blood
-Stay in the Light
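Since "requires DXR" boils down to the game asking Windows whether the device advertises a ray-tracing tier, here is a minimal sketch of that check as an app would do it. Windows-only, needs a recent Windows 10 SDK, link against d3d12.lib; note that it only reports what the driver advertises, not whether DXR runs on dedicated hardware or a fallback path:

[CODE]
// Minimal sketch: ask D3D12 whether this device/driver exposes DXR.
#include <windows.h>
#include <d3d12.h>
#include <cstdio>

int main() {
    // nullptr adapter = default adapter; link against d3d12.lib.
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::printf("No D3D12 device available.\n");
        return 1;
    }

    // OPTIONS5 carries the advertised ray-tracing tier.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    HRESULT hr = device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                             &opts5, sizeof(opts5));
    if (SUCCEEDED(hr) && opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
        std::printf("DXR is exposed by this device/driver.\n");
    else
        std::printf("No DXR support reported (GPU and/or driver too old).\n");

    device->Release();
    return 0;
}
[/CODE]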
 
Sorry to burst your bubble, pal, but you own an RTX 2080, so you already paid for ray tracing :D:D:ROFLMAO::ROFLMAO::rolleyes::rolleyes:

I remember when people bitched about AA being a frame-rate killer.
Motion blur, AF, tessellation, DoF effects, volumetric shadows, and ambient occlusion have all taken their toll.
The list goes on.

I think he meant money, but thanks for giving him a way out. :D:D

BTW how can anyone recommend Navi10 when it will be obsolete next year when even PS5/Scarlett will support RT?

Shouldn't AMD aim for having RT on the next-gen 5700? Why would AMD have any card that has fewer capabilities than a PS5/Scarlett in the sub-$400 market?


Did you even read my post? I clearly stated my reasoning. Or are you here just to mock people again?

BTW, your anecdotes are feeble, and your anger over how well RDNA has been received is telltale. You are not a gamer, because Gamers and Developers all love Navi. It seems you are just angry that people are laughing at Jensen and at Nvidia for trying to oversell their Turdling. Understand the facts: neither YOU nor Jensen himself has anything but broken "RTX On" hardware.

So either jump on the RDNA bandwagon, or be prepared to be stepped all over (for the next 13 months) until Nvidia comes out with their 7nm wonder. Either way, this Xmas you will have the option of buying a Radeon 5800 XT (with 2080 Super performance) for about $499. And the only people getting angry over that are the green fanbois.
 
This was news to me... this game will require DXR support to run:
https://techgage.com/news/stay-in-the-light-a-game-that-requires-nvidia-rtx-to-play/

And devs seem to like DXR; more and more games are supporting or demonstrating support for it:

-Ark: Survival Evolved
-Assetto Corsa Competizione
-Atomic Heart
-Battlefield V
-Bright Memory
-Control
-Cyberpunk 2077 (this one has me all giddy)
-Call Of Duty Modern Warfare
-Dauntless
-Doom Eternal
-Dragon Hound
-Enlisted
-Grimmstar
-In The Black
-Justice
-MechWarrior 5
-Metro Exodus
-Sword and Fairy 7
-Vampires Bloodlines 2
-Watch Dogs Legion
-Wolfenstein Young Blood
-Stay in the Light


Too bad the ONLY hardware that can play those games unfettered by ray tracing is a $1,300 GPU. Which is going to look stupid when $500 consoles are doing ray tracing next year.

How much of a performance gain was the RTX 2080 over the 1080 Ti again? How much is Ampere going to offer (in performance) over Turing? Is Nvidia going to offer low- and mid-range cards with ray tracing next year? Will those lower-tier cards be able to handle ray tracing without a performance hit? Or will it, once again, only be playable on $1k+ GPUs?




DXR is great, but RTX sucks.
Until we get un-broken ray tracing in games, it will remain a gimmick. 2020's hardware is when RT will come to life. Nvidia's Turing is already EOL.
 
This was news to me... this game will require DXR support to run:
https://techgage.com/news/stay-in-the-light-a-game-that-requires-nvidia-rtx-to-play/

And devs seem to like DXR; more and more games are supporting or demonstrating support for it:

-Ark: Survival Evolved
-Assetto Corsa Competizione
-Atomic Heart
-Battlefield V
-Bright Memory
-Control
-Cyberpunk 2077 (this one has me all giddy)
-Call Of Duty Modern Warfare
-Dauntless
-Doom Eternal
-Dragon Hound
-Enlisted
-Grimmstar
-In The Black
-Justice
-MechWarrior 5
-Metro Exodus
-Sword and Fairy 7
-Vampires Bloodlines 2
-Watch Dogs Legion
-Wolfenstein Young Blood
-Stay in the Light
It says in the article it's going to support Pascal... which means software DXR. From an indie dev that maybe three people on the [H] have heard of... man, that almost makes me want to run out and spend that $1,200 in anticipation. Oh wait: "Cowgill hopes to get the final release to a point where it will run okay on a GTX 1060." I can forgo that, buy a 1060, and it should run fine. It seems the use of DXR will be pretty minimal, and the game itself rather... meh. You walk around a cave looking for an exit while a troll chases you around. If that's the type of game you're basing your arguments on now, lol, I'm done.
 
Too bad the ONLY hardware that can play those games unfettered by ray tracing is a $1,300 GPU. Which is going to look stupid when $500 consoles are doing ray tracing next year.

How much of a performance gain was the RTX 2080 over the 1080 Ti again? How much is Ampere going to offer (in performance) over Turing? Is Nvidia going to offer low- and mid-range cards with ray tracing next year? Will those lower-tier cards be able to handle ray tracing without a performance hit? Or will it, once again, only be playable on $1k+ GPUs?




DXR is great, but RTX sucks.
Until we get un-broken ray tracing in games, it will remain a gimmick. 2020's hardware is when RT will come to life. Nvidia's Turing is already EOL.
You complain about others not reading your posts, but you didn't even read the link he provided. If you had, you'd realize the game supports DXR and is targeted at running on a 1060, so its DXR requirements are very low. Also, the gameplay for that game seems... well, just read about it and see if you think it's the AAA title that can push RTX to the top of everyone's Christmas list.
 
You complain about others not reading your posts, but you didn't even read the link he provided. If you had, you'd realize the game supports DXR and is targeted at running on a 1060, so its DXR requirements are very low. Also, the gameplay for that game seems... well, just read about it and see if you think it's the AAA title that can push RTX to the top of everyone's Christmas list.

I said unfettered.
Factum has already posted links about Nvidia finalizing their support for the DXR SDK, and I have already explained (i.e., confronted him about) it. Factum is trying to show that ray tracing is a big deal and that everyone equates ray tracing with Nvidia. He doesn't understand how people see through such proprietary hoaxes. DirectX Ray Tracing (DXR) was always coming in 2020 and has nothing to do with Jensen's teams trying to get "RTX On" (tensor + DLSS) to work in some games. I've confronted him and repeatedly asked: why does Nvidia have to have teams work with each developer to get their ray tracing working?

Nvidia has teams working on each game and with each game's developers to ad-hoc their way into saying RTX works... and to hide the fact that Nvidia has been moving rays, choosing rays, and finding ways to hide rays, because its hardware is choking on DXR.



He has all the knowledge, but plays possum when confronted with facts.
Watch closely how carefully he chooses his words and how subtly he mixes up DXR, RTX, and Nvidia's Turing. But the facts are not on his side, and this is only one discussion; everyone knows that Turing was for creative content, not gaming. (Hence the irony of the GTX DXR SDK link, which completely kills his Turing RTX gig.)


I can only wonder why some hard-bent people don't take a step back from it all to let reality adjust their thinking. Perhaps Factum is becoming an AMD fan by pointing out how biased some Nvidia fans can be...? I have over 700 hours of gaming since I got my $800 RTX 2080, and only one hour of that was spent seeing what ray tracing will look like next year when it is released for the public. RTX was forced on everyone who bought an upgrade from Pascal. Given the choice, I suspect that fewer than 5% of gamers would choose to pay the "RTX On" tax.

I know many people with older G-Sync monitors who missed out on the 1080 Ti at retail and waited a year and a half for the next 1080 Ti, and instead got stuck with an RTX 2070 or RTX 2080... and it's those captive G-Sync owners who were FORCED to pay the RTX tax, when all they were looking to do was upgrade their frame rates from Pascal, perhaps for a little less than a 1080 Ti. Did any Nvidia fan see an RTX card that replaced the 1080 Ti (at the same price)? With Nvidia's failed and floppy RTX cards, you are paying for the exact same performance as you would get buying a GTX 1080 Ti. Except that once the RTX cards were released, those who waited could only go buy a 1080 Ti whose price had been jacked up months beforehand... forcing many to buy a three-year-old Pascal card this year, or pay the RTX tax and deal with getting shafted.



The fact is, Navi 10 is a great gaming GPU and the 5700 series is the best price/performance for gaming right now. It is a great choice for anyone grabbing a GPU off the shelf, as it will only get better and faster. The other fact is that Turing didn't turn out so great for Nvidia or for Gamers, and Nvidia is now stuck mid-life-cycle without a new GPU for at least another whole year.

This is doubly hard on Jensen and Nvidia fans, because neither of them anticipated RDNA. Tick, tock... "Big Navi" and "bigger Navi" incoming. (5800 & 5900 series.)
 
He has all the knowledge, but plays possum when confronted with facts.

You may not realize this but very little of what you post is factual or supported by technical realities. It mostly comes across as conspiracies / wishful thinking.

DXR and raytracing are not owned by or exclusive to nvidia. Repeated claims about how terrible RTX is and how awesome RDNA will be (based on what exactly?) are not facts.

The only facts are that raytracing is here, Nvidia has the first useful implementation, and it will only get better as all the other IHVs get on board in the coming years.
 
As I said in the other thread, we gotta give this guy an A+ for effort. He keeps stating "facts" with no evidence to support them and keeps rebutting real facts with random attacks.

BTW, I'm still waiting for the sales figures you surely have to support the claim that Navi is a great seller (and I really hope it is, really). Not to mention all the citations several others have requested.

Maybe you just deserve a B- after all.

:D:D:rolleyes::rolleyes:
 
If you do not want raytracing...you are SOL...enjoy PONG...the gaming world is moving on.

I do rather like Pong.

I'll wait until the gaming world has already moved on, then I'll see what hardware is available at that time. I see no reason to pay extra just to have the tail wag the dog on ray tracing; it just doesn't move me in any way in its current implementation. While there are a few (very few) games that support it, there are zero games that ~require~ it, or are even significantly diminished for the lack of it. For all of those willing to pay that early-adopter tax: I salute you for subsidizing the development of my future card (and rather chuckle as you all try to justify it to people like me with reasons other than "I just wanted to try it", which is a legit reason, but one no one seems willing to give). I'll get on board when it finally becomes sexy enough for me, when I end up getting it by happy accident without paying anything extra, or when I absolutely need it for some reason.
 