Discussion in 'HardForum Tech News' started by FrgMstr, Aug 20, 2018.
NVIDIA RTX 2080 - 2080 Ti - 2070 Full Specifications Revealed.
I'm down for a 2080Ti - (non founders), $100 more than I was anticipating... but hey... bleeding edge, zero competition.
Will wait till something pops up on Newegg though -- no rush and I'll save $80 in taxes
anyone found any info on nvlink multi-gpu support? I haven't seen anything mentioned other than for Quadros.
*edit* it's not mentioned in the specs, but they are selling nvlink-sli bridges on the same page as the pre-order for $80.
I'm not familiar with NVLink at all -- is this somehow magically different from the SLI connectors at a system level? I heard him mention something about how it's seen as "one big GPU". Would this mean running dual cards wouldn't run into the MANY issues SLI sees today? As it stands, running more than one GPU for gaming is pretty pointless from any angle, it seems. Wondering if NVLink would somehow change this.
That would be nice, since DX12 basically changes the way SLI works. This would mean SLI could continue to work without requiring game developers to actually make it work for each game (which will never actually happen). Previously, with DX11 and before, it was done at the driver level, so it could be done with multiple independent GPUs.
I wonder what the cost-to-core ratio is comparing the 1080 Ti vs. the 2080.
Well, to my knowledge, there has been no talk about the features of NVLink SLI. NVLink on the Quadro line of cards allows GPUs to directly access each other's VRAM. This effectively allows the cards to pool their VRAM, whereas in SLI, each card needed to keep its own local copy. As compute workloads are very parallel and often not dependent on each other, this effectively allows the system to behave like one giant GPU.
Realtime graphics workloads are also very parallel, but they need to remain in sync to be useful, so unless they have some more unannounced tech, they may not benefit from the features of NVLink. NVLink does have a lot of bandwidth, though; it would be awesome to see some unannounced SLI GPU pooling that lets APIs treat the cards as one big one.
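The pooling-vs-mirroring distinction described above can be sketched in a few lines (a toy illustration, not how the driver actually works; the 11 GB card capacity is just an example figure):

```python
def effective_vram_gb(per_card_gb: float, n_cards: int, pooled: bool) -> float:
    """Usable VRAM across a multi-GPU setup.

    pooled=True models NVLink-style pooling, where each GPU can address
    the others' memory, so capacities add up. pooled=False models classic
    SLI, where every card keeps its own full copy of the working data,
    so usable capacity stays at a single card's worth.
    """
    return per_card_gb * n_cards if pooled else per_card_gb

# Two 11 GB cards:
print(effective_vram_gb(11, 2, pooled=True))   # 22 -> behaves like one big GPU
print(effective_vram_gb(11, 2, pooled=False))  # 11 -> mirrored, no capacity gain
```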
I think this is once again a branding issue with Nvidia.
The actual NVLink bridges were going for $500 to $700 apiece, and Nvidia just upgraded their basic SLI bridges with Pascal.
I think they are just throwing the NVlink name on the same bridges used with Pascal.
Leaked photos of the cards/PCB clearly show an NVLink connector on the cards, so at worst, it's actually SLI, but running over NVLink connectors.
I'm interested in the RTX 2080. It needs 215 watts, but they don't say how many watts at idle -- is that 215 watts under load? I have an EVGA 850W Platinum (it should be more than enough, I hope) and I'm looking to build a new PC with a 2700X + RTX 2080.
I think it will come down to the driver features. SLI will benefit from the increased bandwidth, no doubt.
NVIDIA's TDP ratings are under full load, represent total board power, and are always pretty spot on. My Titan X pulls around 240-245W at its peak overclocked to about 2GHz with Optimal power set in the drivers.
850W is definitely more than enough. You could probably run such a system on a 600W PSU.
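A back-of-the-envelope power budget for such a build backs that up (only the 215 W board power comes from NVIDIA's spec and the 105 W TDP from AMD's; the remaining figure is a rough assumption, not a measurement):

```python
# Illustrative full-load figures in watts; the last entry is an assumption.
parts_w = {
    "RTX 2080 (rated board power)": 215,
    "Ryzen 7 2700X (rated TDP)": 105,
    "motherboard, RAM, drives, fans (guess)": 75,
}

load_w = sum(parts_w.values())
psu_w = 850
print(f"estimated load: {load_w} W, headroom: {psu_w - load_w} W")
```

Even with generous margin for overclocking and PSU aging, an 850 W unit is far above the estimated draw.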
Oh, thanks a lot, good to hear. 215 watts under load is good then. You have a Titan X, way more powerful than my GTX 1080, so I'm good to go.
Wow, huge jump on the core count from the 2080 to 2080 Ti!
At least for now. Remember the Supreme Court last month "struck down" the 1992 ruling that allowed much of the e-commerce to be "tax free" in practice. All 50 states are currently working on passing legislation to start taxing ALL internet purchases. Few things move faster through legislation (with bi-partisan support) than bills offering new untapped revenue $treams.
Sadly, we are in the final days of tax free shopping on the internet.
I don't know what state you live in, but a lot of states actually require you to report sales tax on online purchases where it wasn't collected at the time of the transaction. Still, not a lot of people do it, but there is always a possibility that they will go after you in the future. I think Louisiana recently did.
So the question becomes: what speed will these actually run at?
The last two or three generations of Nvidia cards tended to run at least a bit in excess of their stated boost frequency as long as the fans were allowed to spin. Now that even the reference cards are a dual-fan design rather than blower style (also indicating a further lack of support for multi-GPU configurations), the card should be able to run a bit cooler and with a bit more boost than before... or at least a bit more boost than if it had a blower fan.
True, but virtually no one reports it on their tax return. IIRC, less than .5% of all non vendor collected internet sales tax was reported to the IRS last year.
Between "forced" sales tax collection at the vendor level on virtually all internet purchases AND the 25% tariff on all Chinese made tech, PC hardware is about to get MUCH more expensive imminently.
Yeah, 48% more cores and 38% more memory and memory bandwidth. Not too bad for the 50% price premium.
Surprising the relatively low core count of the 2080 vs. the 1080 Ti (22% advantage). Considering how well the core and memory overclock on the 1080 Ti, Turing had better have a significant performance per core advantage and/or they better overclock like crazy.
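The percentages quoted in the last couple of posts check out against the announced specs (core counts and memory below are the published figures; the price comparison assumes the $799/$1,199 Founders Edition pricing):

```python
cores = {"GTX 1080 Ti": 3584, "RTX 2080": 2944, "RTX 2080 Ti": 4352}
mem_gb = {"RTX 2080": 8, "RTX 2080 Ti": 11}

# 2080 Ti vs. 2080
print(round((cores["RTX 2080 Ti"] / cores["RTX 2080"] - 1) * 100))    # ~48% more cores
print(round((mem_gb["RTX 2080 Ti"] / mem_gb["RTX 2080"] - 1) * 100))  # ~38% more memory
print(round((1199 / 799 - 1) * 100))                                  # ~50% price premium

# 1080 Ti vs. 2080
print(round((cores["GTX 1080 Ti"] / cores["RTX 2080"] - 1) * 100))    # ~22% core advantage
```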
Well, they said the SM array was entirely new, so we may not be able to directly compare shader performance between the two.
Is the leaked picture of a 2080 Ti or of a Quadro?
I think the Pascal Quadros had NVLink while obviously the Titans did not, so a leaked picture like that wouldn't be the best thing to go off of.
I hope you are right though, might think about sticking with SLI this time if that's the case.
I would agree, but more for the fact that the launch prices appear to be absurd, so they would have to do everything short of folding my laundry to make me want to pay those prices!
So did they ever say why they skipped the 11xx series naming? And I suppose RTX replaces GTX moving forward?
As a 1080 Ti owner I will be skipping the 20xx series; the price tag is not worth the 20-30% performance. BTW, with the 2080 Ti you will get 10 more fps in EFT.
I doubt these cards will overclock as consistently well as Pascal did. The 2070 has a listed base clock higher than any Pascal chip did, but as you go up the stack the clocks keep going down, to the point that the 2080 Ti's base clock is a couple hundred MHz slower than any Pascal card's was.
I can't imagine there being a 400+ MHz difference in base clock speeds between the two if there weren't a significant silicon-lottery aspect to it.
On the plus side, there might be some real value in holding out for the most heavily binned 2080 Tis, like the Lightning, with better parts on the PCB.
Guess we will be finding out.
I bought two Ti and two 2080.
Well, 768 more cores, a 21% increase. 1635MHz clock on the FE card, but Tom Petersen was saying it was a monster overclocker, and he usually does not lie. I would guess, with the better memory bandwidth, possibly 25% faster in GPU-heavy non-RTX games. Just a guess.
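For reference, the "768 more cores" figure is the 2080 Ti vs. the 1080 Ti (4352 vs. 3584 announced CUDA cores), and the math works out:

```python
cores_1080ti, cores_2080ti = 3584, 4352

extra = cores_2080ti - cores_1080ti
print(extra)                              # 768 additional cores
print(round(extra / cores_1080ti * 100))  # ~21% increase
```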
You have to request it, and can only do so when someone has tried to Impersonate you.
Ah. So Kile Binnit hasn't tried to take it yet.
I had my 7970 card until the 980 ti came out.. Looks like I'll be holding onto my 1080 ti.
I'm still sucking hind teat with a single GTX 960 4GB. I have been holding off for the new generation and will be waiting until I see Kyle's reviews, hoping that NVLink will work with the RTX cards like it does with the Quadro cards. If it does, then two RTX 2070s will give me 16 GB of memory and 4608 CUDA cores for the cost of one RTX 2080, I hope. Need the memory & cores for digital rendering of NSFW sci-fi comic images. Don't get to do much gaming any more, so it's cheap copies of Barbarella and Galaxena style comics for me.
Any news on when the reviews will be out? I mean, almost all tech sites seem not to have questioned that. LOL! No one is even saying, hey, different times now. I haven't even seen a date for reviews, lol!
Reckon I'll be hanging on to my 780ti a bit longer, because my plan on getting a "good deal" on a new 1080ti probably ain't gonna happen, since there's no downward price pressure from the new generation.
I was hoping for something like that too....
Way too early to draw meaningful conclusions but at 1080p this can't be good:
It's kind of full circle. In the past I was on the side of the fence where I was worried about quality over fps. At this point in 4k/ultra I'm happy with the quality and I want more fps than I do more bells and whistles. Many have commented here and on different threads and sites about this gen being about new architecture/features and not performance boost. Really be glad when some reviews come out and explore the compromises being made for ray tracing vs. performance.
Can't read that article, but PCGH points out a few important things such as the implementation still being in development and that ray tracing won't be ready when the game is released.
Both links have mostly the same info. To be fair, the newer TRs have always been card crushers at the highest settings. Just a bit unsettling to see a new Ti at 1080p get hit this hard. Interesting how NVLink is also prominently listed in the specs on all sites, but as we all know, SLI is more like the undead these days. Maybe someone should tell the devs it might be the only way anyone will be able to play these new AAA games when ray tracing is fully implemented. Fortunately, Square Enix has supported it in all the TRs so far.
Is there an estimate of folding power? Or we just have to wait?
Does anyone else get the impression of smoke and mirrors with the "RTX" 2080ti? This is probably the most hyped launch that I can remember since the Ti4600... and that thing was a turd for the money. This makes me feel the same way that I did when people pre-purchased Tesla Model 3's 18-months prior to promised delivery (and most of those were late). Then, the Model 3 had the fit and finish of a 90s Kia (no thanks, not putting my kids in that).
People would do well to wait until real reviews appear instead of trying to be "FIRST ZOMG LOLZ".