Developer sees no reason to add promised RTX support

Gideon
2[H]4U
Joined: Apr 13, 2006 · Messages: 3,523
https://www.overclock3d.net/news/so...i8LsZ8--tRW0Tr-UO9rCd2cdc_tOaCKrsBMkNAtjx-J6Q


Developer sees "no reason" to add promised RTX support to Assetto Corsa Competizione
Nvidia RTX support remains rare in the world of PC gaming, despite the company's best efforts to bring developers on board, and it now looks like some of Nvidia's previously announced RTX titles will not receive their promised graphical updates for quite some time. A developer from Kunos Simulazioni has stated that the studio has "no reason" to spend development resources on the feature.

Ever since the reveal of its RTX series of graphics cards, Nvidia has maintained a list of games which "will feature real-time ray tracing". This list included Assetto Corsa Competizione, with Nvidia going so far as to use the game as an example of how ray tracing could impact in-game visuals. A Gamescom RTX demo for Assetto Corsa Competizione is embedded below.

On the official Assetto Corsa Forums, a staff member called "Aristotelis" released the following statement on RTX support within Assetto Corsa Competizione. In this statement, he commented that the company had a "long list of priorities" that trumped RTX support. Furthermore, Aristotelis noted that the company had "no reason to steal development resources and time for a very low frame rate implementation" of the feature. His full statement can be read below.

"Our priority is to improve, optimize, and evolve all aspects of ACC. If after our long list of priorities the level of optimization of the title, and the maturity of the technology, permits a full blown implementation of RTX, we will gladly explore the possibility, but as of now there is no reason to steal development resources and time for a very low frame rate implementation."
 
Seems about right. Depends on how heavy an implementation they were going for, but the statement about low frames leads me to believe it probably required more rays than they could get out of current hardware.
 
Ouch, I hope this doesn't become a trend. I hadn't read anything about implementation difficulties until now. Is RT too time-consuming to be economically integrated into games? Or is he speaking strictly to its in-game frame rate being too low to bother with?
 
Sounds more like they want a bigger paycheck from nVidia

Can you blame them? This isn't a major studio pumping out AAA games. They probably have limited resources, and pulling people away from features that benefit all customers to implement features that benefit only a very small fraction of their customers doesn't make sense. If Nvidia wants wider adoption of ray tracing, they should be footing the bill.
 
I just assumed that Nvidia would naturally be involved with development from the get-go, particularly because they are on the short list of RT titles being released. They should be giving them all the help they need to KEEP it on the list.
I'd bet NV will reach out to them. That's probably what the devs of ACC are hoping for. By releasing that statement they're obviously going to generate a ton of publicity, which will motivate NV to help them.
 
I wouldn't be shocked if they agreed to implementing it before finding out the performance implications. Or that, of the RTX lineup, only like two top-tier GPUs would be capable of doing even a mediocre job of running it lol.

I appreciate the Devs honesty. Essentially, "We have bugs to fix and content to add. We may add this eventually, but it's at the bottom of our list."
 
I wouldn't be shocked if they agreed to implementing it before finding out the performance implications. Or that, of the RTX lineup, only like two top-tier GPUs would be capable of doing even a mediocre job of running it lol.

I think this probably isn't far from the truth. I think they promised it knowing it would get them some free press and attention, and that maybe it might result in more sales. Maybe they got some money and/or dev support from nVidia to include it, maybe not, but it's pretty typical for that to be included.

I could see them having announced it w/o any promise of support from nVidia just to get on the radar, but I don't know if that's the case and I kind of doubt it given the press it's got from nVidia themselves.

I'm 50/50 split on if it was "got in over their head" or "realized they could hold out for more money" -- could see it coming from either way really.
 
I think it's a very bold statement from the dev if they did accept money to implement it. It would have been much safer to say, "we do plan to implement this, but we are addressing more critical bugs/features first." Instead they pretty well drove RTX into the mud.

That said, employees can be stupid.
 
They probably did implement it, at least enough for the demo, but then realized it's trash and there's no fixing the performance aspect of it. So why burn man-hours trying to make something work that will never work? At least not for their game.

I'd be interested in knowing the extent of their contractual obligations with Nvidia on it. To make a statement like they did, I have a feeling they were only obligated to produce the demo.
 
In this statement, he commented that the company had a "long list of priorities" that trumped RTX support.

Understandable; ATM that's what's gonna pay the bills. A more financially successful company could have done it all.

While I don't write off an Nvidia-cash fishing attempt, it could also just be a lack of resources and a wish to stay financially independent.
 
In this statement, he commented that the company had a "long list of priorities" that trumped RTX support.

Understandable; ATM that's what's gonna pay the bills. A more financially successful company could have done it all.

While I don't write off an Nvidia-cash fishing attempt, it could also just be a lack of resources and a wish to stay financially independent.
His final comment is that he doesn't want to waste resources and time on "a very low frame rate implementation".
It looks like it isn't worth the effort anyway.
 
Sounds like Agile happened and the devs finally pushed back.

It is what it is.

Get wrekt, management.
 
Ouch, I hope this doesn't become a trend. I hadn't read anything about implementation difficulties until now. Is RT too time-consuming to be economically integrated into games? Or is he speaking strictly to its in-game frame rate being too low to bother with?
I think it's both. A lot of effort for a small return. If it can only run on a 2080 Ti at okay frame rates and nobody else can use it... you're putting a lot of effort in for a very minimal benefit, especially if you have other features to work on that hit a much broader market.
 
I think it's both. A lot of effort for a small return. If it can only run on a 2080 Ti at okay frame rates and nobody else can use it... you're putting a lot of effort in for a very minimal benefit, especially if you have other features to work on that hit a much broader market.
Yeah, the 2080 and above is like a percent or two of the market. RT is currently pointless outside of a tech demo, from a financial perspective.
 
Yeah, the 2080 and above is like a percent or two of the market. RT is currently pointless outside of a tech demo, from a financial perspective.
Yeah, like you said, the market share is not big enough for developers to waste time on it.
I know they have no competition from AMD, but lowering the price might help adoption?
 
"Our priority is to improve, optimize, and evolve all aspects of ACC. If after our long list of priorities the level of optimization of the title, and the maturity of the technology, permits a full blown implementation of RTX, we will gladly explore the possibility, but as of now there is no reason to steal development resources and time for a very low frame rate implementation."
Factum hardest hit ;)
 
Just seems RTX is PhysX 2.0 as far as how it will be adopted, not the tech itself.

Anything to make it purtier and more realistic can't be bad; it just needs time. In 5 years we will laugh about this.
 
Yeah, like you said, the market share is not big enough for developers to waste time on it.
I know they have no competition from AMD, but lowering the price might help adoption?
Problem is, then shareholders bitch. The TU102 (2080 Ti) is the largest consumer GPU die in history, and they are also quite expensive to make, so margins suffer.
This is the price they pay for doing what AMD did until Navi (repurposing commercial parts), but with a GPU die twice the size. RTX is basically a consumer application of tensor cores, probably the same hardware with a software tweak, though apparently the hardware is different. I have seen no proof of that, though... especially considering a Titan V also does pretty well in RT with just "tensor cores"...
 
Makes sense. There's zero reason for devs on a tight budget to prioritize ray tracing in terms of revenue. The target market is just too small.

Big houses like DICE, EA, Unity, and 4A are adopting it early because they're on the cutting edge of engine development. And of course Nvidia is likely pushing it hard with marketing dollars, developer time, etc.
 
Game devs should be getting dev kits for the next-gen consoles in the near future. Then we will see them start to implement ray tracing. By the end of the 2020 holiday season there will probably be more next-gen consoles out there than RTX PCs.
 
Promising it is reason enough. Or did Nvidia add them to the list without contacting them first?
 
Sounds to me like RTX will be in the same boat as PhysX. Dead. No matter how hard they try to push it. My 2 cents.
 
Sounds to me like RTX will be in the same boat as PhysX. Dead. No matter how hard they try to push it. My 2 cents.
Ummm... PhysX is about the most popular physics engine in use today... just not the same as it used to be, since it's now open-sourced and used with shaders instead of proprietary hardware. RTX was just Nvidia trying to be out of the gate first, with full plans to support DXR and whatever implementation goes into Vulkan, which appears to be based on Nvidia's current extensions. So while "RTX" may go away, the foundation and subsequent implementations will still be based off of it.
 
I think it's a very bold statement from the dev if they did accept money to implement it. It would have been much safer to say, "we do plan to implement this, but we are addressing more critical bugs/features first." Instead they pretty well drove RTX into the mud.

That said, employees can be stupid.


All depends on the contract, but ultimately they never said it won't happen, just that they're not going to waste the resources pre-release if they don't have the time to implement it. Doesn't mean it can't be added later. That being said, they could also point the finger at Nvidia for potentially making promises on the performance side that never panned out to make the game playable.

I still think the devs are doing the right thing, though. Make sure the game you're making is actually good so it sells, and then, if it does, consider adding a limited-use-case feature when you have the development time. Because if they did it the other way around and it fell on its face, I'd guarantee Nvidia's not paying enough to save that studio.

NVM, apparently I'm an idiot and can't read anymore after working 26 days straight, and didn't realize the game was already released... most of my points still stand though.
 
If you do something, and the results are shitty, people are going to bitch, and ain’t nobody got time for that
 
My beef with the "at this point" is the fact that we're 12 months post RTX launch, and there is very little to justify the significant price increase over non-RTX Pascal cards.
My beef is that anytime anyone says RTX isn't worth the price point or exciting at this time, and lays out their honest opinion (the emperor has no clothes), all the Nvidia guys line up to bash you because you don't know how utterly brilliant it is to have single-light-source real-time ray tracing built into your engine at 30 FPS... In most scenes in most games you can't even tell the difference whether ray tracing is on or off.

Honest
Owner
Opinion
 
My beef is that anytime anyone says RTX isn't worth the price point or exciting at this time, and lays out their honest opinion (the emperor has no clothes), all the Nvidia guys line up to bash you because you don't know how utterly brilliant it is to have single-light-source real-time ray tracing built into your engine at 30 FPS... In most scenes in most games you can't even tell the difference whether ray tracing is on or off.

Honest
Owner
Opinion

After reading your post, now that I think about it, the devs may actually be doing Nvidia a favor by not releasing it, because if RTX support was added and was completely unplayable, Nvidia's going to be the one dealing with the backlash.
 
Problem is, then shareholders bitch. The TU102 (2080 Ti) is the largest consumer GPU die in history, and they are also quite expensive to make, so margins suffer.
This is the price they pay for doing what AMD did until Navi (repurposing commercial parts), but with a GPU die twice the size. RTX is basically a consumer application of tensor cores, probably the same hardware with a software tweak, though apparently the hardware is different. I have seen no proof of that, though... especially considering a Titan V also does pretty well in RT with just "tensor cores"...

The 2080 Ti still kicks the V's ass in RT and is a smaller die.

Funny enough, in their latest quarterly report their margins didn't change. Still at ~60% gross margin.

A racing game is definitely not one where you want low FPS, so if they can't implement it right, not at all is probably best.
 
Ummm... PhysX is about the most popular physics engine in use today... just not the same as it used to be, since it's now open-sourced and used with shaders instead of proprietary hardware. RTX was just Nvidia trying to be out of the gate first, with full plans to support DXR and whatever implementation goes into Vulkan, which appears to be based on Nvidia's current extensions. So while "RTX" may go away, the foundation and subsequent implementations will still be based off of it.

Right, but it took one of your SLI cards to run it properly, or a dedicated PhysX card, for the first few years to keep it from bogging your system down. It's only when it was developed and trimmed to run as a CPU task that it got fully implemented in the last couple of years.

Nvidia fucked up majorly with the 20 series.

#1 Little to no SLI support. When you have a new hardware-based implementation, first gen, you should push SLI and help developers support it. Sell more cards, like when PhysX came out. Duh, more money, and you'd be able to run RTX the way it's meant to be!

#2 Pricing: Well, you just couldn't help but be first to market, and sales were and still are stagnant in the home PC market because of the higher cost to develop this card, which really should have been on the 7nm or 7nm+ process. But hey, We Were First mentality. $1200 for the Ti version. Have you lost your fucking mind? That's a nice house payment or 3 to 4 car payments, bullshit. If it wasn't for PayPal and the credit-spending gen, that card would still be sitting on shelves, fully stocked. And don't tell me you spent cash. You either live in mom's house or are rich with nothing better to do.

#3 Piss-poor management that should have gotten Huang a fine or a reprimand from his peers/board members. When your top-tier card, the 2080, comes out and is on par with a 1080 Ti = lost sales. Look at how many of us bought 1080 Tis at $650 or less when the 2080 dropped. And I still have no reason to buy anything new. Yes, I know they have other areas of interest like AI and the automobile industry, but don't leave the gaming community as a second thought when they built your company. Yes, we did.

#4 Should have concentrated on gaming now, not later. The 20 series should all have been able to run 4K resolution at 60 FPS in any game, with the higher-tier cards pushing 100 to 120 FPS. They combined their compute cards with gaming, which was the wrong decision; the two should have stayed separate till 7nm or better was achieved. À la a PhysX-like card, or like #1 and #2 where you could afford two cards.

Well, that's IMO and my overall observation. Disagree? Cool. That's just how I feel about why Nvidia failed with gamers.
 
My beef with the "at this point" is the fact that we're 12 months post RTX launch, and there is very little to justify the significant price increase over non-RTX Pascal cards.

This is a common perception, that RTX is somehow meant to justify Turing prices. The launch prices were incredibly stupid, but look at the launch picture for the GTX 1080 (full GP104) and the RTX 2080 Super (full TU104).

The GTX 1080 launched at $700 in May 2016, the RTX 2080S at $800 in July 2019. So $100 more for 50% more performance. That's a ~31% increase in perf/$; the sad part is we had to wait three long years.

RTX has very little to do with that, though. Turing added lots of shaders and a whole separate INT pipeline. That resulted in a 545mm² die vs 314mm² for Pascal on what's essentially the same 12/16nm process.

You could argue that Pascal was also overpriced, but based on manufacturing cost alone, Nvidia isn't exactly making bank with Turing.
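For what it's worth, the perf/$ figure above checks out with the post's own numbers (the 1.5x performance ratio and both launch prices are taken straight from the post; the rest is just arithmetic):

```python
# Back-of-envelope check of the perf/$ comparison above.
gtx1080_price = 700.0    # USD, GTX 1080 launch, May 2016 (per the post)
rtx2080s_price = 800.0   # USD, RTX 2080 Super launch, July 2019 (per the post)
perf_ratio = 1.5         # 2080 Super vs 1080 performance, per the post

# perf/$ improvement = (performance ratio) / (price ratio) - 1
gain = perf_ratio / (rtx2080s_price / gtx1080_price) - 1
print(f"perf/$ improvement: {gain * 100:.0f}%")  # prints "perf/$ improvement: 31%"
```

So roughly a 31% perf/$ gain over three years, which is exactly the "small improvement, long wait" point being made.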
 
My beef is that anytime anyone says RTX isn't worth the price point or exciting at this time, and lays out their honest opinion (the emperor has no clothes), all the Nvidia guys line up to bash you because you don't know how utterly brilliant it is to have single-light-source real-time ray tracing built into your engine at 30 FPS... In most scenes in most games you can't even tell the difference whether ray tracing is on or off.

Honest
Owner
Opinion

No kidding. I have one in my sig also (and I didn't pay full price either). God forbid you criticize the price point or the usefulness of RTX.
 
This is a common perception, that RTX is somehow meant to justify Turing prices. The launch prices were incredibly stupid, but look at the launch picture for the GTX 1080 (full GP104) and the RTX 2080 Super (full TU104).

The GTX 1080 launched at $700 in May 2016, the RTX 2080S at $800 in July 2019. So $100 more for 50% more performance. That's a ~31% increase in perf/$; the sad part is we had to wait three long years.

RTX has very little to do with that, though. Turing added lots of shaders and a whole separate INT pipeline. That resulted in a 545mm² die vs 314mm² for Pascal on what's essentially the same 12/16nm process.

You could argue that Pascal was also overpriced, but based on manufacturing cost alone, Nvidia isn't exactly making bank with Turing.

The problem with your argument is you're looking at the best-case scenario for Nvidia. The 1080 was never really $700 unless you wanted the Founders Edition. The AIB versions started at $599 and dropped further after the 1080 Ti launch. Likewise, the 2080 non-Super hit in between with an asinine $799 price point when the 1080 Ti was $699. It's not until recently that we've had the performance boost you're talking about.
 