Titan A anyone?

It’d be nice, but for me it’s the memory I need; the grunt I can usually wait a bit longer for, so that’s less important, since I’m not an impediment that costs money if I take a bit longer. Obviously I could theoretically survive on a Titan RTX too. In production it’s different, but I don’t run that on office machines.

Looking forward to actually experimenting again. Dunno what it is with remote machines, but I never use them as much for playing around.
 
If it is for gaming then it will probably not be worth it. When/if they release a Titan, don't expect it to be any less than $3000.
 
The 3090 isn't any kind of "A100"; it's GA102, and, AFAIK, we don't know how many shaders are disabled. It's certain to be well shy of 13,824, though.
 
Anyone waiting to see if a full chip Titan A comes along? 3090 is cool but it's a neutered Ampere A100.

The A100 has "13,824" cores, while the 3090 has "10,496". That 3,328-core difference is over 30% more cores, which could mean over 30% more performance than a 3090 (quick math below).

https://www.anandtech.com/show/15801/nvidia-announces-ampere-architecture-and-a100-products

I'm gonna wait, and if that means I wait until nextgen, that's cool too.

Considering the 3090 is 5-10% faster than the 3080, I doubt the core difference will translate to 30% more perf.
 
If it is for gaming then it will probably not be worth it. When/if they release a Titan, don't expect it to be any less than $3000.

Worth it to me if it's almost 14k cores. $3k... It is a Titan.


The 3090 isn't any kind of "A100"; it's GA102, and, AFAIK, we don't know how many shaders are disabled. It's certain to be well shy of 13,824, though.

Why is that certain? The last two Titans were full-chip XX100s.

Considering the 3090 is 5-10% faster than the 3080, I doubt the core difference will translate to 30% more perf.

You may need to check those 3090 benchmarks more closely; so far it's 20% faster than the 3080.


This thread's a bummer so far, just a bunch of folks not in the market for one trying to play it down lmfao! :ROFLMAO:
 
Worth it to me if it's almost 14k cores. $3k... It is a Titan.




Why is that certain? The last two Titans were full-chip XX100s.



You may need to check those 3090 benchmarks more closely; so far it's 20% faster than the 3080.


This thread's a bummer so far, just a bunch of folks not in the market for one trying to play it down lmfao! :ROFLMAO:
Not trying to play anything down. Is it going to be a beast? Sure. Just not worth the price if it is only for gaming. More power to you if you want it.
 
Worth it to me if it's almost 14k cores. $3k... It is a Titan.




Why is that certain? The last two Titans were full-chip XX100s.



You may need to check those 3090 benchmarks more closely; so far it's 20% faster than the 3080.


This thread's a bummer so far, just a bunch of folks not in the market for one trying to play it down lmfao! :ROFLMAO:
Which benchmarks? The ones I've seen have shown a < 9% difference on average (8.825%), unless some new ones popped up?
 
Why is that certain? The last two Titans were full-chip XX100s.

The part that is "certain" is the CUDA core count of a fully-enabled GA102. GA100 will never come to desktop, firstly because it has nearly twice the transistor count of GA102, and secondly because it has some fundamental differences compared to the Ampere you're familiar with.
 
Not trying to play anything down. Is it going to be a beast? Sure. Just not worth the price if it is only for gaming. More power to you if you want it.

It is if you want the absolute best performance in gaming.

The part that is "certain" is the CUDA core count of a fully-enabled GA102. GA100 will never come to desktop, firstly because it has nearly twice the transistor count of GA102, and secondly because it has some fundamental differences compared to the Ampere you're familiar with.

There's a PCIe version of the A100, so what differences do you speak of? That argument is still just opinion.


https://videocardz.com/newz/nvidia-geforce-rtx-3090-3dmark-time-spy-scores-leaked

That doesn't take into account games that will use the significantly greater RT and Tensor core count of the 3090.
 
It is if you want the absolute best performance in gaming.

You know the Titan hasn't been a gaming card since Volta, right? Nvidia has been moving the Titan brand out of the consumer market. It's aimed at more professional workloads.
 
You know the Titan hasn't been a gaming card since Volta, right? Nvidia has been moving the Titan brand out of the consumer market. It's aimed at more professional workloads.

That's weird, because my Volta Titan card has been gaming just fine... hmm, funny. I also recall the Titan RTX performing better than the 2080 Ti in games, so I guess they're still up to the task no matter the "label," and even better at it than a "gaming" card. Maybe I should do what the box it comes in tells me instead of what reality tells me :ROFLMAO:

This thread's really drawing out some straight-up bullshit lmfao. Guess folks here aren't [H]ard anymore and don't use their own brains; the manufacturer brochures are working.
 
It is if you want the absolute best performance in gaming.



There's a PCIe version of the A100, so what differences do you speak of? That argument is still just opinion.



https://videocardz.com/newz/nvidia-geforce-rtx-3090-3dmark-time-spy-scores-leaked

That doesn't take into account games that will use the significantly greater RT and Tensor core count of the 3090.

Your only source is 3DMark Time Spy?

Checks out.

Nobody is saying the Titan isn't faster, just that the percentage increase in performance is not reasonable for the percentage increase in cost. If you have money to burn on incremental performance increases, then enjoy your 3090.
 
Your only source is 3DMark Time Spy?

Checks out.

Nobody is saying the Titan isn't faster, just that the percentage increase in performance is not reasonable for the percentage increase in cost. If you have money to burn on incremental performance increases, then enjoy your 3090.

Checks out.

Another Soft Forum post somehow leaking over to [H] :woot:
 
We don't know that yet. Some leaks suggest the difference is 10%; others suggest it's closer to 20%. In any case, we need to wait and see.

There is more evidence pointing to 10% than 20%. All I'm saying is that if the 3090 is only a tiny bit faster than a 3080 with substantially more cores, I can't see a Titan scaling much better.

In any case, who cares? I fail to see why this dude is getting offended by people saying the 3090/Titan is a poor value proposition. It objectively is; there's nothing to argue. If he has the money to blow on it, cool. Hope you get the performance you want for the $ you spend, Stryker.
 
There's a PCIe version of the A100, so what differences do you speak of? That argument is still just opinion.

That doesn't actually mean anything, and it isn't just my opinion. Have you noticed how many CUDA cores NVIDIA states are on GA100? Also, it is not even remotely true that Titans are always fully-enabled XX100 chips. Go back just ONE generation to the Titan RTX and you'll see that it is a TU102-derived chip.
 
That doesn't actually mean anything, and it isn't just my opinion. Have you noticed how many CUDA cores NVIDIA states are on GA100? Also, it is not even remotely true that Titans are always fully-enabled XX100 chips. Go back just ONE generation to the Titan RTX and you'll see that it is a TU102-derived chip.

What technical information do you have that makes it not viable? The CUDA core count on the consumer cards just released isn't the actual physical count; they doubled it for marketing purposes based on each core's ability to deliver twice the performance of a previous CUDA core. I doubled the A100's count to come up with "13,824" in accordance with NVIDIA's new way of counting CUDA cores.

Touché on the Titan RTX.

Check it out, I started this thread to see if there were like-minded folks who would contribute to the idea of a possible Titan A, not to make arguments about price (no shit it's going to be expensive) or price/performance value propositions. No one buying a Titan is looking for value; they want the best, and don't care what you think about the cost.
 
Checks out.

Another Soft Forum post somehow leaking over to [H] :woot:

Hard Overclockers Comparison Page.

The entire original point of this community was to push the limits of parts to determine whether the higher expenditures on "elite" hardware were even worth the price tags, instead of just buying a much cheaper part, OC'ing the snot out of it, and coming close to, meeting, or exceeding the performance of their highest-priced brethren, given a specific usage scenario.

All data is subjective. If you feel that the performance increase for an RTX Titan Ampere at $2000+ over an RTX 3090 at $1500 is justified (or an RTX 3090 at $1500 over an RTX 3080 at $700-800, or an RTX 3070 at $500 vs the RTX 3080, etc), then more power to ya.
If you absolutely have a desire to have the most expensive top tier products from a given company's portfolio in your home PC no matter the cost, then go for it: I think you'll find yourself in the vast minority of HOCP'ers, though...
 
What technical information do you have that makes it not viable? The CUDA core count on the consumer cards just released isn't the actual physical count; they doubled it for marketing purposes based on each core's ability to deliver twice the performance of a previous CUDA core. I doubled the A100's count to come up with "13,824" in accordance with NVIDIA's new way of counting CUDA cores.

The CUDA core count was not doubled for marketing purposes. The count is in keeping with how a CUDA core has always been counted. Also, 13,824 still isn't two times 8,192.
 
Ye, it was.

No. It wasn't. A CUDA core is a floating point unit within the SM. Each SM is able to process two FP32 instructions simultaneously now (as opposed to one previously), hence the doubling of CUDA cores.

From the GA102 White Paper:
Most graphics workloads are composed of 32-bit floating point (FP32) operations. The Streaming Multiprocessor (SM) in the Ampere GA10x GPU Architecture has been designed to support double-speed processing for FP32 operations. In the Turing generation, each of the four SM processing blocks (also called partitions) had two primary datapaths, but only one of the two could process FP32 operations. The other datapath was limited to integer operations. GA10x includes FP32 processing on both datapaths, doubling the peak processing rate for FP32 operations. As a result, GeForce RTX 3090 delivers over 35 FP32 TFLOPS, an improvement of over 2x compared to Turing GPUs.

The block highlighted in red below used to only be able to process integer workloads, and therefore was not included in the count of CUDA cores.
[Diagram: GA10x SM, with the formerly INT32-only datapath highlighted in red]
 
No. It wasn't. A CUDA core is a floating point unit within the SM. Each SM is able to process two FP32 instructions simultaneously now (as opposed to one previously), hence the doubling of CUDA cores.

From the GA102 White Paper:


The block highlighted in red below used to only be able to process integer workloads, and therefore was not included in the count of CUDA cores.
[Diagram: GA10x SM, with the formerly INT32-only datapath highlighted in red]

I'll buy that, thanks for the knowledge.

That makes me wonder if anything better than the 3090 will be released this gen, hmmm. It'd be nice to know whether it's a full chip or not, or if there will be a 3090 Ti.
 
That's weird, because my Volta Titan card has been gaming just fine... hmm, funny. I also recall the Titan RTX performing better than the 2080 Ti in games, so I guess they're still up to the task no matter the "label," and even better at it than a "gaming" card. Maybe I should do what the box it comes in tells me instead of what reality tells me :ROFLMAO:

This thread's really drawing out some straight-up bullshit lmfao. Guess folks here aren't [H]ard anymore and don't use their own brains; the manufacturer brochures are working.

Just because you are buying a Titan card for e-peen doesn't change the facts.

Of course you can game on the Titans, I never said otherwise. But you are paying double the price for an increase in performance that isn't noticeable in games. The Titan RTX smashes the 2080 Ti in DP workloads but is barely any faster in games.

Paying a silly amount of money for a Titan V card for gaming doesn't make you [H]ard. It makes you stupid. I will retract this statement if you are using it for professional work too.
 
Just because you are buying a Titan card for e-peen doesn't change the facts.

Of course you can game on the Titans, I never said otherwise. But you are paying double the price for an increase in performance that isn't noticeable in games. The Titan RTX smashes the 2080 Ti in DP workloads but is barely any faster in games.

Paying a silly amount of money for a Titan V card for gaming doesn't make you [H]ard. It makes you stupid. I will retract this statement if you are using it for professional work too.

Triggered lmao. See post 21:

"Check it out, I started this thread to see if there was like-minded folks that would contribute to the idea of a possible Titan A, not to make arguments about price, no shit it's going to be expensive, or price/performance value propositions. No one buying a Titan is looking for value, they want the best, and don't care what you think about the cost." ;)

What's stupid is the clowns on a Titan thread crying about a Titan.
 
This thread's a bummer so far, just a bunch of folks not in the market for one trying to play it down lmfao! :ROFLMAO:
Bummer? No, this is exactly what should happen. I love seeing people talk about how the 3090 won't be worth it. The 3090 is a terrible value.

*checks bank account*

Nobody should buy one.

*sets up links to all major e-tailer 3090 pages*

10% is laughable, you should just buy the 3080 instead.

*dabs silicone lube on F5 keyswitch*
 
;)
Bummer? No, this is exactly what should happen. I love seeing people talk about how the 3090 won't be worth it. The 3090 is a terrible value.

*checks bank account*

Nobody should buy one.

*sets up links to all major e-tailer 3090 pages*

10% is laughable, you should just buy the 3080 instead.

*dabs silicone lube on F5 keyswitch*

Well it's a good thing this thread is about a Titan and not a 3090.

Checks bank: has hard-earned money that I can't take with me when I die, so I buy whatever the F I want to have the best experience while alive.

Checks thread: sees people crying about the price/performance ratio when no one cares about your whining in a Titan thread.
 
;)


Well it's a good thing this thread is about a Titan and not a 3090.

Checks bank: has hard-earned money that I can't take with me when I die, so I buy whatever the F I want to have the best experience while alive.

Checks thread: sees people crying about the price/performance ratio when no one cares about your whining in a Titan thread.
psst

i'm on your side buddy

go buy a titan if they release one
 
The only comparison that matters is to previous TITANS.
In the meantime I will have a 3090 while they make up their mind about releasing a TITAN.
 
You guys are really jumping the gun. I'm just looking forward to seeing what the 3090 can actually do.
 
The only comparison that matters is to previous TITANS.
In the meantime I will have a 3090 while they make up their mind about releasing a TITAN.

I disagree. There is a market for people who want the fastest video card available for gaming regardless of price. I've purchased Titan X's and an RTX 2080 Ti in the past for that purpose. The price of the 3090 isn't a deal breaker for me.
 
I bought a Titan Xp because I wanted the fastest gaming GPU possible and at the time there was no 1080 Ti. I skipped the RTX 2000 series because the 2080 Ti was ~30% faster than what I already owned and not worth $1300 to me.

I think the 3080 is an excellent product, but if the rumors are true, I don't think a ~115% price increase over the 3080 is worth it for 10-15% more performance (rough math after this post).

At the end of the day, the performance needs to translate into a tangible benefit. Is 10% faster frame rates going to be a measurably different experience? Probably not. Maybe I'll be surprised. But it seems unlikely.

Given the value proposition of the 3090 I expect the Titan cards when they appear will be even worse and targeted even more squarely at professional work.
 
Well, the 3090 is cut down, so there will definitely be a Tesla-class card, maybe a 48 GB full-fat chip. As far as whether NVIDIA will actually create a Titan SKU, I think that depends on AMD. If they release a card that threatens the 3090 in any way, then NVIDIA can just do a price drop and debut an uncut Titan card.
 
I think it's much more likely we'd see a 3080 Ti than a Titan in the short term, but there's not a lot of room left in between the 3080/3090 cards if the 3090 is only 10% faster.

I'm curious where the bottleneck is.
 
Yes, most definitely getting an Ampere Titan. I was hoping to get two to do Ampere Titan SLI, but with them all but killing off SLI completely (even for older DX11 games), I'm not sure that's even a possibility. However, at least one is a must! :D

I'm good with my 2x Titan RTX rig - trying to find the damn 3090 in stock for my 2nd rig
 
Does anyone know what the TDP of such a card would be?

I'm thinking of getting one too, to replace the aging Titan X Pascal I currently have. Trying to figure out how small of a case I can get away with for one of these.
 