Rumor Mill: RTX 2080?

DooKey

Good ol' AdoredTV has pushed out a new video on YouTube speculating on the new RTX 2080 (yes, you read that correctly, RTX), which will supposedly feature a Turing GPU with ray tracing technology. Also of note in his video is that the new Turing cards aren't going to be significantly faster than the current generation. As always, you should take rumors with a grain of salt, but it's fun to hear what everyone has to say. If you have the time, you should check out his video.

Watch the video here.
 
Setting aside the bathtub full of salt for a moment: if this is real, if the numbers are real, and if the prices are real, the RTX 2080 would definitely be the card I want, especially if the price leans more towards the $500-$600 range instead of $700. Otherwise I'll probably try to snag a cheap 1080 Ti. My 1080 just doesn't cut it at 4K and I need something at least a semi-decent amount faster.
 
the new Turing cards aren't going to be significantly faster than the current generation

??

His source says the 2080 will be 50% faster than the 1080 (7:00 mark), but only slightly faster than the 1080 Ti. It also says the 2070 will be 40% faster than the 1070 (10:43). That's pretty significant. That's enough to make me tempted to replace my 980Ti with a 2080 and not wait for the 2080 Ti, if there will be one.
 
??

His source says the 2080 will be 50% faster than the 1080 (7:00 mark), but only slightly faster than the 1080 Ti. It also says the 2070 will be 40% faster than the 1070 (10:43). That's pretty significant. That's enough to make me tempted to replace my 980Ti with a 2080 and not wait for the 2080 Ti, if there will be one.

I was comparing what the 1080 did to the 980 Ti. I guess it depends upon your point of view.
 
??

His source says the 2080 will be 50% faster than the 1080 (7:00 mark), but only slightly faster than the 1080 Ti. It also says the 2070 will be 40% faster than the 1070 (10:43). That's pretty significant. That's enough to make me tempted to replace my 980Ti with a 2080 and not wait for the 2080 Ti, if there will be one.

Probably because people are used to the new x70 being close to the previous gen's x80 Ti, not the new x80 being the one that's close. I still think it's a decent bump myself!
 
Setting aside the bathtub full of salt
But look at what they get you!

[Image: johnmcafee_uninstall meme]
 
Probably because people are used to the new x70 being close to the previous gen's x80 Ti, not the new x80 being the one that's close. I still think it's a decent bump myself!

If all the details are right the 2070 will be a killer card, even if it launches around $450-500.
 
So those of us with 1080 Ti's remain in the same position that people with 980 Ti's were in when the 1080's dropped.....you'll be ok, wait for the Ti parts in 12-16 months.....8%? No. 50%? Yep. $3 grand for 50%? No. $800 for 50%? Yep. Ti in 12 months.
 
I hope the price on the RTX Titan is wrong. I plan on buying the next Titan, but not at that price. :(
 
Yeah, that Titan price is def wrong. That $3,000 price is what that other card cost maybe 6 months ago, and that was mostly a scientific-focused card. Not sure how the prices got mixed up.

Kids don't have $3,000 for a Titan card.

I've got my $1,000 - $1,200 for a new Titan. I would buy one right this minute if I could.

Not sure how many Titan cards have come out, but I know of at least 2 that shipped over the past 4 or 5 years for around $999 or $1,100.

The 6GB Titan from 2013 and then the Pascal Titan from a couple of years ago. Were there any others?

50% faster than a 1080 Ti for $1,200? HELL YEAH!
 
Yeah, that Titan price is def wrong. That $3,000 price is what that other card cost maybe 6 months ago, and that was mostly a scientific-focused card.

Kids don't have $3,000 for a Titan card.

I've got my $1,000 - $1,200 for a new Titan. I would buy one right this minute if I could.

I could believe it. Nvidia has no competition in the dedicated GPU space right now. Who knows when, or if, AMD will ever put out something competitive, so Nvidia can price halo cards like that however they want.
 
Yeah, that Titan price is def wrong. That $3,000 price is what that other card cost maybe 6 months ago, and that was mostly a scientific-focused card. Not sure how the prices got mixed up.

Kids don't have $3,000 for a Titan card.

I've got my $1,000 - $1,200 for a new Titan. I would buy one right this minute if I could.

Not sure how many Titan cards have come out, but I know of at least 2 that shipped over the past 4 or 5 years for around $999 or $1,100.

The 6GB Titan from 2013 and then the Pascal Titan from a couple of years ago. Were there any others?

50% faster than a 1080 Ti for $1,200? HELL YEAH!

That Titan was also not called "GeForce".
And a good bargain for people not able to afford a true Tesla card... 75% of the performance (no ECC, though) for 33% of the price of the Tesla offering... people need to pull their heads out of their *bleeeps* and engage the grey matter... but alas, this happens EVERY TIME it becomes "Silly Season" *sigh*
 
Yeah unfortunately this all seems much too plausible to be ignored.

It sounds a lot like the Kepler 600/700 series, where the big GPU didn't trickle down to the consumer line until the next "refresh". The original Titan didn't become a GeForce card until the 780/780 Ti "next generation", which was just a Kepler refresh. This likely means the 2080 Ti may just be a fully enabled 104 GPU instead of the actual big GPU.
 
ray-tracing built into the hardware will take at least another generation to work out the kinks...plus it'll take years for games to actually implement this in any meaningful way
 
If the Titan is going to be geared for double precision and have good acceleration for AI, it would be a good value, unlike the last few.

Seems like my 1080ti will only be dethroned by the 2080ti so no harm there.

Wonder if the 2030 will be gimped?
 
I really think this "leak" is nothing but hot air. No way a new Titan is coming out at 3k... they already have the Titan V for that... I would be really surprised if Nvidia manufactured two architectures this time around because of the mining craze. I also call bullshit on the RTX naming scheme... It's not like next gen's graphics cards are the end-all be-all of ray tracing... I'd take this "LEAK" with a very big grain of salt.
 
So those of us with 1080 Ti's remain in the same position that people with 980 Ti's were in when the 1080's dropped.....you'll be ok, wait for the Ti parts in 12-16 months.....8%? No. 50%? Yep. $3 grand for 50%? No. $800 for 50%? Yep. Ti in 12 months.

Plenty of people ditched their 980ti's for a 1080 for some bizarre reason or other. I expect this will be no different, a sidegrade at best if those numbers are accurate.
 
??

His source says the 2080 will be 50% faster than the 1080 (7:00 mark), but only slightly faster than the 1080 Ti. It also says the 2070 will be 40% faster than the 1070 (10:43). That's pretty significant. That's enough to make me tempted to replace my 980Ti with a 2080 and not wait for the 2080 Ti, if there will be one.
Same here. The wait for the Ti would drag on too long, with us needing the power for 4K now.
 
ray-tracing built into the hardware will take at least another generation to work out the kinks...plus it'll take years for games to actually implement this in any meaningful way

That was my thought too while watching the video. These are like the first cards released at the start of a new generation of DirectX. My ATI 5850 was DX11-capable in late 2009, and I bet I didn't play a game that required DX11 till 2011-12.
 
That was my thought too while watching the video. These are like the first cards released at the start of a new generation of DirectX. My ATI 5850 was DX11-capable in late 2009, and I bet I didn't play a game that required DX11 till 2011-12.

That new Metro game supposedly supports ray tracing... but it sounds more like a prototype demo versus full support throughout the game...

 
The 980 Ti has a TDP of 250W while the 1080's TDP is 180W, so it's easier for cool and quiet gaming. As you say, such a "bizarre" reason to switch: lower TDP, and better performance on top of that :p

At the time, though, the 1080 was only available in the FE variety; all I saw was people ditching third-party cards like the Strix and the Gigabyte triple-fan card for 1080 FE cards. Just shows people like to have the latest new toy regardless of whether it makes sense or not. :oops:
 
I was comparing what the 1080 did to the 980 Ti. I guess it depends upon your point of view.

You're probably referring to the 1080 vs. the 980? The 1080 vs. the 980 Ti wasn't a large jump either, especially factoring in overclocking. However, if we got exactly the same kind of performance jump as from the 980 to the 1080 and from the 980 Ti to the 1080 Ti, we would be talking about 50%-70%, which is A LOT. I don't have any reason to believe the jump will be in that range, though; more like 30% up to 50%.
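
To see how the rumored numbers hang together, here's a quick back-of-the-envelope check relating "50% over the 1080" to the 1080 Ti. The ~30% uplift of the 1080 Ti over the 1080 used below is my own rough assumption, not a figure from the video:

Code:
# Back-of-the-envelope check on the rumored uplift (Python).
# Assumption (not from the video): a 1080 Ti is roughly 30% faster than a 1080.

gtx_1080 = 1.00                  # baseline
gtx_1080_ti = gtx_1080 * 1.30    # assumed ~30% over the 1080
rtx_2080 = gtx_1080 * 1.50       # rumored 50% over the 1080

uplift_over_ti = rtx_2080 / gtx_1080_ti - 1
print(f"Rumored 2080 vs. 1080 Ti: {uplift_over_ti:+.0%}")  # about +15%

Roughly +15% over a 1080 Ti, which lines up with the "only slightly faster than the 1080 Ti" wording in the video.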
 
Having just watched the video: he never actually claims to have verified the source, e.g. with a photo of an employee badge or something that he himself saw.

He just claims to believe the material sent to him. But anyone with Photoshop and an eye for detail could have produced these visuals.
 
If true, nVidia would be the first to bring a ray tracing card to a commercial/mainstream release, but they wouldn't actually be the first with the hardware. Caustic Graphics had prototypes nearly a decade ago, though far from real time. They were bought out by Imagination a few years ago and the technology was rebranded as PowerVR Wizard (vs. the PowerVR Rogue architecture for mainstream raster graphics, insert joke about red warrior needing food). Under Imagination, the developer hardware from a few years back was able to tackle real time, albeit at 720p and 30 FPS, but that is an accomplishment. However, about a year ago Imagination sold itself to Canyon Bridge after Apple decided to stop using their GPUs for iOS devices. So unless Canyon Bridge has secret plans for bringing forth a discrete GPU, nVidia would be the first to push ray tracing into the mainstream.
 
Plenty of people ditched their 980ti's for a 1080 for some bizarre reason or other. I expect this will be no different, a sidegrade at best if those numbers are accurate.

Yeah, you get everyone trying to bail out of their rapidly depreciating hardware; problem is, the only place to jump to is one step up. I don't like that upgrade cycle. If I'm spending big bucks I want to feel like I'm getting big improvements, so I'm on a Ti-to-Ti cycle right now and that seems to work.
 
Setting aside the bathtub full of salt for a moment: if this is real, if the numbers are real, and if the prices are real, the RTX 2080 would definitely be the card I want, especially if the price leans more towards the $500-$600 range instead of $700. Otherwise I'll probably try to snag a cheap 1080 Ti. My 1080 just doesn't cut it at 4K and I need something at least a semi-decent amount faster.

https://smile.amazon.com/dp/B0722YB...olid=3O16CWH1SEEQC&psc=0&ref_=lv_ov_lig_dp_it $659.99? Can't come up with that extra $60?
 
I'm interested, my 1080 is getting old :p I do hope they come with the latest DisplayPort and HDMI standards so we can try to do 4K at 144 Hz, if they ever bring out monitors that can handle that.
 
Ray-tracing in ANY form other than ShillWorks™ implementations is at least 8 years off UNLESS AMD has something like that for the new consoles.
 
I'm so far behind the times I don't even know what to compare this to. What is the latest Nvidia GPU?

This is what happens when you don't build a PC for 2 years.
 
Plenty of people ditched their 980ti's for a 1080 for some bizarre reason or other. I expect this will be no different, a sidegrade at best if those numbers are accurate.

Just wait and see... it will happen again, with people claiming that GA104 is NVIDIA's next "high end"... even though the 256-bit bus should tell people that it's NVIDIA's mid-range GPU.
I can quote myself from 2 years ago:
I am waiting for BIG Pascal (GP102), no interest in medium Pascal (GP104)... as I am upgrading from a BIG die (Gxxx0)

Funny how people can only read marketing names (GTX 680, GTX 780, etc.) and not SKU names (GK104, GK110), and thus they sidegrade into medium cores just because it is new.

GPUs from NVIDIA follow a simple path:

High end SKUs: GF100, GF110, GK110, GM200 -> GP102 (500+ mm^2 dies, >256-bit memory bus)
Mid range SKUs: GF104, GK104, GM204 -> GP104 (200-300 mm^2 dies, 256-bit memory bus)
Low range SKUs: GF106, GK106, GM107 -> GP106 (100-200 mm^2 dies, 128- to 192-bit memory bus)

Now two GPUs stand out:
GTX 680 - GK104: NVIDIA was able to use their midrange SKU (GK104) to compete with AMD's high end GPU, the HD 7970 (Tahiti XT, GCN 1).

GTX 1080 - GP104: NVIDIA's midrange SKU (GP104) was able to beat AMD's midrange Polaris 10 SKU (GCN 4) by a large margin.

But both times AMD dropped the ball, giving NVIDIA the option of making more profit from their mid-range SKUs.

So to recap... this is a bleeding-edge midrange GPU.

I wish people would start reading about the GPU, not just looking at pretty PR letters on a box...

So the lineup will look like this now:

High end SKUs: GF100, GF110, GK110, GM200, GP102 -> GA102 (500+ mm^2 dies, >256-bit memory bus)
Mid range SKUs: GF104, GK104, GM204, GP104 -> GA104 (200-300 mm^2 dies, 256-bit memory bus)
Low range SKUs: GF106, GK106, GM107, GP106 -> GA106 (100-200 mm^2 dies, 128- to 192-bit memory bus)
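
To make the point concrete, here is a minimal sketch of the naming pattern from the lists above; the suffix-to-tier mapping is just my reading of this post, not any official NVIDIA spec, and the GA entries are of course still rumored:

Code:
# Illustrative only: classify a die code by its suffix, per the lists above.
# The mapping is an assumption drawn from this post, not an NVIDIA spec.

TIER_BY_SUFFIX = {
    "100": "high end", "110": "high end", "200": "high end", "102": "high end",
    "104": "mid range", "204": "mid range",
    "106": "low range", "107": "low range",
}

def tier(die_code: str) -> str:
    """Return the rough tier for a die code like 'GP104' or 'GA102'."""
    return TIER_BY_SUFFIX.get(die_code[2:], "unknown")

print(tier("GP104"))  # mid range  (the GTX 1080's die)
print(tier("GP102"))  # high end   (the 1080 Ti / Titan Xp die)
print(tier("GA104"))  # mid range  (the rumored "RTX 2080" die)

Whatever ends up printed on the box, a 104 die sits in the same slot GK104 and GP104 did.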
 
Like you, my "new" rig is an i7-4770K I built five years ago, and it's still going strong thanks to upgrading the GPU (from a GTX 670 to a 1070) and adding more SSDs along with some extra RAM. My "old" rig was from 2006 and had a 2.0 GHz AMD Opteron 170. Still rocking my Antec P180 case from 2006.

Instead of upgrading CPU/motherboard/RAM I "invested" (consumption, really) in an Eizo CS2730 monitor that has hardware calibration along with excellent luminance and color uniformity.

Actually I'm wrong. I forgot I built a new rig for my daughter last year with an i5-7500 & GTX 1060. (Had to go back & look at my Newegg purchase history.)

But then again computer technology moves so fast I might as well be a few years behind lol.
 
But then again computer technology moves so fast I might as well be a few years behind lol.
It used to move fast. The difference in performance between a new computer and one that was a year old used to be obvious.
But recently, you usually only notice the difference if you benchmark or push the bleeding edge.

SSDs were the last major performance boost for PCs, and you can retrofit them into older PCs and get that boost.

The thrill is gone, I'm afraid. Yes, AMD has caught up, that was nice. But still.
 
Do you need that massive FP64?

Who buys stuff because they "need" it anymore? That's so last century.

FP64 is not used for gaming and isn't beneficial to it in any way, shape, or form...
In fact, it would make gaming performance suffer by increasing TDP and die size (just like Tensor cores)... so it would be counterproductive.
 