Discussion in 'HardForum Tech News' started by cageymaru, Nov 15, 2018.
I just pulled mine out of the attic so I can be ready for ray tracing on the RTX 2070
I think there was a mention of disabling the Future Frame Rendering option to fix the input lag issue. I think the game looks nice with RTX on, tbh. Just wish there was a bit more performance to work with.
The problem with disabling that option is that DX12 has severe random frame drops without it enabled.
Simple, because they wanted to sell cards.
Here is another example. Many years ago, I bought a 1080i (interlaced, non-progressive) flat-glass-tube Sony HDTV. 1080i was all over the packaging, the commercials, and the brochures, and came spewing out of the salesman's mouth at a very rapid pace.
Did I ever watch any broadcast 1080i content, or any other type of 1080i content? No, never. But I was smart enough to know this ahead of time. I bought the set for other reasons: the size (it was massive at 36"), the flat glass tube with an excellent picture, etc.
Same goes for RTX. It's a new tech, and the performance was revealed to be not that great when they showed off Battlefield V. There are a handful of very good reasons smart, informed buyers should never have expected RTX to be a big deal.
17 frames a second with high RTX settings on at 1080p? Yeah, we have a hell of a long way to go. So let's be really generous with the 2nd- and 3rd-generation implementations of RTX in 2020 and 2022 and give each a 100% increase in speed. Our 17 FPS goes to 34 FPS in 2020. Then, in 2022, we give the 3rd generation another 100% increase in performance and our 34 FPS goes to 68 FPS. Do you see where I am going with this?
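The compounding math in that post can be sketched in a few lines; note that the 17 FPS baseline and the 100% gain per generation are the poster's assumed figures, not measurements:

```python
# Project FPS under the poster's assumption of a 100% speedup per
# RTX hardware generation, starting from the quoted 17 FPS baseline
# (1080p, high RTX settings). Purely illustrative arithmetic.
def project_fps(baseline_fps: float, generations: int, gain_per_gen: float = 1.0) -> float:
    """Return projected FPS after `generations` generations of compounding gains."""
    return baseline_fps * (1 + gain_per_gen) ** generations

for gen, year in enumerate([2018, 2020, 2022]):
    print(f"{year}: {project_fps(17, gen):.0f} FPS")
```

Even with that generous doubling every generation, the projection only clears 60 FPS around the third generation, which is the poster's point.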
None of you, and I do mean none of you, should have ever expected anything out of RTX. Not now and not for a while. We may get some great games with RTX enabled in 2019, but I can't imagine the RTX implementation in those games will be awe-inspiring. I also want to leave open the possibility that some very smart people may figure out some really cool stuff with RTX that could end up giving us some fantastic gains. It's a possibility I look forward to.
You know what, I'll give it to Nvidia for pushing something game-related to do with tensor cores.
As I have stated in other threads, there is no such thing as an RT core. It's marketing. However, Nvidia (and I would say everyone else as well) is going to want to find uses for the tensor hardware that the GPU manufacturers' server clients are all asking for.
Going forward, NV, AMD, and Intel are all going to be producing GPUs with tensor hardware that has essentially zero use for the average consumer.
Partial ray tracing is one possible use, and perhaps AA modes (I'm not sure I buy the NV server-trained data set stuff yet, but we'll see). So yeah, kudos to Nvidia for getting their marketing machine turning first. Who knows, 5-6 generations out this might even be a feature that defaults to on, even on mid-range gear. I'm not sure any of the game developers are competent enough to find novel uses for tensor hardware on their own, so hopefully NV, AMD, and perhaps even Intel can dream up some more ideas to take advantage of the matrix math units on every GPU die going forward. (Too bad someone like John Carmack isn't still focused on games; someone like that could perhaps find a cool use for tensor hardware no one else has thought of yet.)
Right now, of course, it doesn't seem ray tracing is going anywhere. No one is dropping their games to sub-60 FPS on their $1k+ GPUs, outside of perhaps people wanting a few screenshots. It will be interesting to see whether NV's AI-driven AA idea fares any better performance-wise.
Well, the RTX 2080 Ti's performance over the 1080 Ti in normal games doesn't justify its price hike; it is within the traditional high-end-to-high-end performance increase expected between generations. It was only being considered OK due to its ray-tracing capabilities.
Will ray tracing cause a performance drop on the 2080 Ti? Yes, of course, but definitely not as much as it does now, as the card is supposed to be built for it; drivers will surely recover some performance in the future.
We've always seen this in the past when new cards support a newer DX version: a moderate performance increase on the previous DX version vs. existing cards, but much better performance than the previous generation of cards once titles on the newer DX come out. That is not what RTX has done so far.
Seeing this and the current HW issues, the RTX is a bad series, price-wise and performance-wise. Not that this will cause any issues for Nvidia, as, thanks to AMD, people looking for the best new cards will buy the RTX since the GTX is out of stock.
If RTX is choking the CUDA cores, could a dedicated RT card be a possibility? Like the old PhysX days: have a GPU with CUDA and a dedicated tracing card. It would be exactly like the old days, and eventually, when it matures, they can fit it onto the GPU, just like PhysX. I don't really know shit about the engineering; it was just a thought. The other thought is that they want to push it into the GPU in hopes of widespread adoption, unlike the bumpy road PhysX had.
Didn't read it yet.
Did they test the impact of core count and/or CPU frequency?
More cores were supposed to be beneficial, IIRC.
RTX is the damn name of the card. They should have backed off the failtracing shit and just called it a GTX, because I don't see it being any use at all for ray tracing. It's pathetic to assume high-end users want to A) use 1080p and B) run under 60Hz. Fuck off with that 2007-era shit. My old X800 XT was pushing 1080p better than that; sure, not with ray tracing, but with decent lighting on Lost Coast.
It's like Ferrari selling a 488 Challenge (the track version) that can only do 160 km/h (100 mph) when it's hot.
Meh, not terribly impressed, tbh. I actually think the rasterized lighting looks a bit better. It seems like the ray-tracing function was somewhat of an afterthought.
edit: as a CG fan, I am happy that this tech is emerging though.
People that bought the new cards expecting to use ray-tracing got suckered.
Kinda [H]ard to argue with that!
Soul Calibur with all its flashy lights would be great for ray tracing.
I got one because I do research and dev on Nvidia HW as well as game. The tensor cores will lead to applications that potentially do not require extremely cost-prohibitive Volta or Tesla cards. I've said it before that the 2080 Ti is not worth the cost, and I would implore those on the fence, or who don't have the cash for it, to buy something else.
If a game that comes out uses RTX features, then I think of that as a bonus at the moment.
Clearly no one remembers when AA was new, or HDR on the 6x00 line, both of which had the same effect on frame rate.
While I can agree with the sentiment, the problem is that Nvidia isn't making the games themselves, and software varies staggeringly in optimization. For example, go try playing PoE at max quality and watch the frames dip regardless of how expensive your hardware is. Great game, crap engine. Diablo 3 has similar problems because it runs a modified version of the SC2 engine, so it has random dips for no reason, especially with a WD around. There are a number of games that look amazing at Ultra and run butter smooth on any level of hardware; in some games I have had an EVGA 1060 6GB running 1440p Ultra settings at 50-80 fps, no problem.
I really think some perspective is necessary here.
Keep in mind that real-time ray tracing has been the holy grail of computer graphics and has been difficult to achieve without monstrous processing power. And now, suddenly, it can be accomplished on a single video card. If you bother to think back, there were a lot of talking heads saying we would likely NEVER see real-time ray tracing in gaming because it just required too much computing power. Do we really expect this first iteration to be perfect? I would expect exactly what we are seeing: some limited application of real-time ray tracing, but real ray tracing nonetheless. That is what we have, and dang, I'm amazed. And I'm seeing posts of people complaining because of a performance hit when it is enabled? Really? We are talking about a completely different paradigm here. One that until quite recently was believed to be out of reach.
I just have to say that I am a bit surprised at the "outrage" at the performance numbers. Real-time ray tracing has been the holy grail of rendering since the 90s. The fact that we have this at all, in a package that fits an everyman's computer at a relatively reasonable price point, is absolutely astounding. That people still find a way to complain about it, sadly, is not.
So with all the settings maxed, you get 49 FPS...
I want to read the [H] review... They figure out what settings are playable, and will tell us how it feels too.
And all those same settings they find playable, tested on a 1080 Ti and the top-end AMD card as well, to compare. Since it is a DirectX implementation, can ray tracing be enabled on a 1080 Ti or AMD cards? Because if it can, and we can see those all compared apples to apples, that would be interesting.
We should never let the drive of consumerism engulf us. We should be amazed by amazing things. Today you can go to a Ford dealer and buy a 500hp Mustang making that power with a small 302-cubic-inch engine. That's over twice the power my '89 Mustang GT made with the same displacement. The computing power we take for granted today is amazing too, yet so often our attitude is "meh."
Be amazed. Because the time we live in could scarcely be imagined a couple generations ago.
I game at 800x600 in 2018 with my 2080ti. RTX baby. Just wait though until those drivers mature. Then I may be able to use the features plastered all over the box without a gigantic performance hit.
I don't apologize for Nvidia. People who are buying cards for tech that hardly exists, at skyrocketing prices, are only encouraging them to keep up the shenanigans. I voted with my wallet. Looks like my 1080 will last yet another generation.
Until Kyle and crew review RTX games, I'm not buying any glowing positive or negative reviews. I'm just tired of people conflating their own expectations and their expected $/value on things and determining them to be globally true for everyone and their mother.
I'm too busy trying not to get killed to look at reflections in puddles. RTX OFF.
That's because it's not actually outrage at the root of it, but frustration or annoyance that the new shiny has been placed out of reach on expensive GPUs. Or it's people with an axe to grind with Nvidia/DICE/EA or whomever, who don't want to consider an interesting new render tech simply on its own merits.
The only people who are complaining are the ones who can't afford the card in the first place, plus the uninformed and clueless. A lot of us already knew ahead of time not to expect anything out of RTX. I couldn't give a rat's ass about the name on the card; it still doesn't mean anything. People are silly to expect great things out of real-time ray tracing. BTW, I own a 2080 Ti. I bought it for the raw performance.
Don't forget the deplorables and filthy working class folks!
Yep, the ignorant "you can't afford it" comments keep coming in. I said from the launch that this would be the typical go-to move for those that get a hard-on trying to feel superior when defending their $1200 purchase.
Wow... I wonder if SLI improves things. Hey Kyle, are you going to test whether TWO 2080 Ti's make 1440p playable?
$2400 in video cards to get decent FPS in one game. Yeah... no thanks. lol.
More like people who have more money than sense. I know plenty of people who make far less than I that are far more incredibly foolish with money.
You want to spend 1200 bucks on a GPU that can't play a game with all the eye candy maxed? More power to you.
Hope you find a budget DP to VGA adapter that works.
First of all, let's get crystal clear on a few things. Anyone can save; anyone can plan ahead. Most of you have crap lying around you could sell. No one is feeling "superior"; in fact, I wish AMD would get better people working in their graphics division so we would have some competition. We don't, and you AMD people will not get anything, again, for a while yet, and when you do, don't expect much. I didn't enjoy paying $1,300. Anyone who thinks I did is an idiot. So, no, you have it all wrong, clearly. In fact, your use of "ignorant" is ... ignorant. Let's get with the game plan.
Before you want to defend something, understand the "something" you are trying to defend.
If anyone here would like to buy an RTX 2080 Ti, message me; I'll spend a few minutes giving you some great advice on how to get one with a few simple changes you can make, along with some great resources. Despite the "ignorant this" and "superior that" being thrown around, I really have tried to help people. I often talk about how easy it is. And it is. So I stand behind what I said. I'm very well aware of who is complaining and the reasons behind the complaints. Real talk.
I suppose the GTX 2060 will be the card without RTX
Going by Techspot's review, it looks like the 2070's problem is that it has insufficient RT cores.
Maybe NVIDIA should have released a 2070 Ti with the same number of RT cores as the 2080, instead of the 2070 with an insufficient number of RT cores.
Nintendo Switch resolution with DLSS applied to upscale to 1080p should work well, I guess.
Yeah 2070 is a waste for RTX but a reasonable upgrade for those looking into the 1080 performance level but with the possibility of DLSS down the road. [H] did a good real world usage review about 2-3 weeks ago.
Wouldn't be surprised. In fact, I'm expecting it. The whole scrap-yard mix-and-match they made of Pascal continues to unfold, so it's entirely possible we'll have 5-10 variations in 10-18 months.
Edit: Found a wiki of what I'm talking about regarding the Pascal releases. I don't believe it has all the versions (the newest 1060 isn't listed), but even then there are 17 versions from the 1030 to the 1080 Ti.
Almost makes me miss my old 1366x768 CRT. The thing weighed a ton but was amazing for its time.
I joked earlier this summer, when the first RTX performance numbers leaked, about pulling my old 27" 1080p/120Hz/3D Asus out of the closet for these. Well, it's been sitting at the corner of the table, reconnected to my 1080 SLI rig, because of SOTTR and KCD. Who knew DVI-D would still be useful?
no, the input lag says heeeeellll noo
Would you just stop already. Every thread it's "jealousy" or "can't afford it." It has absolutely nothing to do with saving or selling stuff around the house to buy one. We get it. You love your card, but there are plenty of us out here who would buy in a second at $699-799 for a raw horsepower card, but refuse to pay ridiculous prices for beta testing RTX features and paying an extra $400-500 for the privilege of doing so. I like gaming, but I don't like it enough to drop $1200+ when I feel like I'm getting ripped off. And I certainly am not going to pay $800-900 for a card when I can get an equivalent one for $550-600 (2080 vs. 1080 Ti), or pay $500-600 for a card I can get for $350-400 (2070 vs. 1080), when the new card can't even run the new features being advertised.
Let's not pretend yours is the only opinion and everyone else is "ignorant" as you say. I'll even grant that you think you are helping people, but it really comes down to the fact that some people (myself included) are never going to give Nvidia $1200 for this card simply because they don't think it's worth $1200 regardless of the raw power or the lack of competition. If it were $800, I'd buy one (or at least think about it).
Did you even look at Nvidia's promotional materials? "When it comes to next-gen gaming, it’s all about realism. GeForce RTX 2070 is light years ahead of other cards, delivering truly unique real-time ray-tracing technologies for cutting-edge, hyper-realistic graphics." The #1 new feature for the 2070 is ray tracing, and it just can't do it fast enough to be playable on anything but the most basic settings. So clearly Nvidia isn't thinking like you do about it being "just" a raw horsepower upgrade over the previous generation; otherwise, maybe it would be priced like the previous generation.
Exactly. When you have to ask yourself "is it worth it?", you're basically trying to convince yourself that it is, because in reality you know it isn't. And that's a problem for Nvidia.
IMO Nvidia's candle is burning at both ends: not only is their hardware out of reach for most people (realistically), but its performance with RTX falls on its ass. Not good.
Lol, real talk? You are so oblivious to the points being made that it's laughable. The fact that you still think it's all about being able to afford it just reinforces what I said about your comment being ignorant.