Anyone found any info on NVLink multi-GPU support? I haven't seen anything mentioned other than for Quadros.
I'm not familiar with NVLink at all -- is this somehow magically different than the SLI connectors on a system level? I heard him mention something about how it's seen as "one big GPU". Would this mean running dual cards wouldn't run into the MANY issues SLI sees today? As it stands, running more than one GPU for gaming is pretty pointless from any angle, it seems. Wondering if NVLink would somehow change this.
I think this is once again a branding issue with Nvidia. The actual NVLink bridges were going for $500 to $700 apiece, and Nvidia just upgraded their basic SLI bridges with Pascal. I think they are just throwing the NVLink name on the same bridges used with Pascal.
I think it will come down to the driver features. SLI will benefit from the increased bandwidth, no doubt.
Leaked photos of the cards/PCB clearly show an NVLink connector on the cards, so at worst it's actually SLI, but running over NVLink connectors.
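For a rough sense of why the bandwidth matters, here's a back-of-the-envelope comparison in Python. The NVLink 2.0 per-link and PCIe figures are public; the SLI HB bridge number and the Turing link counts hadn't been confirmed at this point, so treat those as assumptions:

```python
# Back-of-the-envelope interconnect comparison, GB/s per direction.
# NVLink 2.0 is 25 GB/s per direction per link (public figure);
# the SLI HB bridge number and the Turing link counts are rough
# assumptions, not confirmed specs.
bandwidth = {
    "SLI HB bridge (rough assumption)": 2,
    "PCIe 3.0 x16": 15.75,
    "NVLink 2.0, 1 link (2080, assumed)": 25,
    "NVLink 2.0, 2 links (2080 Ti, assumed)": 50,
}

for name, gbs in bandwidth.items():
    print(f"{name:40s} {gbs:6.2f} GB/s")
```

Even a single NVLink link would dwarf any SLI bridge, which is presumably where the "one big GPU" talk comes from.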
I'm interested in the RTX 2080 and the 215 watts it needs, but it doesn't say how many watts at idle. Is that 215 watts under load? I have an EVGA 850W Platinum (it should be more than enough, I hope) and I'm looking to build a new PC with a 2700X + RTX 2080.
NVIDIA's TDP ratings are under full load, represent total board power, and are always pretty spot-on. My Titan X pulls around 240-245W at its peak, overclocked to about 2GHz with Optimal Power set in the drivers.
850W is definitely more than enough. You could probably run such a system on a 600W PSU.
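For a rough sanity check on the PSU question, a quick load estimate in Python. The GPU and CPU numbers are published TDPs; the allowance for the rest of the system is an assumption:

```python
# Rough worst-case power budget for the proposed build (watts).
# GPU and CPU numbers are published TDPs; the rest-of-system
# allowance is an assumption, not a measurement.
gpu_tdp = 215   # RTX 2080 total board power
cpu_tdp = 105   # Ryzen 7 2700X
rest    = 100   # assumption: motherboard, RAM, drives, fans

load = gpu_tdp + cpu_tdp + rest
psu  = 850
print(f"Estimated load: {load} W ({load / psu:.0%} of a {psu} W PSU)")
```

That lands around 420W, roughly half the PSU's rating, which squares with the 600W comment above.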
I'm down for a 2080 Ti (non-Founders), $100 more than I was anticipating... but hey, bleeding edge, zero competition. Will wait till something pops up on Newegg though -- no rush, and I'll save $80 in taxes!
I don't know what state you live in, but a lot of states actually require you to report sales tax on online purchases where it wasn't collected at the time of the transaction. Not a lot of people do it, but there's always a possibility they'll go after you in the future. I think Louisiana recently did.
Wow, huge jump on the core count from the 2080 to 2080 Ti!
Yeah, 48% more cores and 38% more memory and memory bandwidth. Not too bad for the 50% price premium.
Surprising the relatively low core count of the 2080 vs. the 1080 Ti (22% advantage). Considering how well the core and memory overclock on the 1080 Ti, Turing had better have a significant performance per core advantage and/or they better overclock like crazy.
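Those percentages check out against the spec sheets. A quick sketch, assuming the published core counts, memory specs, and Founders Edition prices:

```python
# Spec-sheet ratios behind the percentages above. Core counts,
# memory, bandwidth, and FE prices as published at announcement;
# double-check before relying on them.
cards = {
    "2080":    {"cores": 2944, "mem_gb": 8,  "bw_gbs": 448, "fe_usd": 799},
    "2080 Ti": {"cores": 4352, "mem_gb": 11, "bw_gbs": 616, "fe_usd": 1199},
    "1080 Ti": {"cores": 3584, "mem_gb": 11, "bw_gbs": 484, "fe_usd": 699},
}

def pct_more(a, b, key):
    """Percent by which card a exceeds card b on a given spec."""
    return (cards[a][key] / cards[b][key] - 1) * 100

print(f"2080 Ti vs 2080, cores:     +{pct_more('2080 Ti', '2080', 'cores'):.0f}%")   # +48%
print(f"2080 Ti vs 2080, memory:    +{pct_more('2080 Ti', '2080', 'mem_gb'):.0f}%")  # +38%
print(f"2080 Ti vs 2080, bandwidth: +{pct_more('2080 Ti', '2080', 'bw_gbs'):.0f}%")  # +38%
print(f"2080 Ti vs 2080, FE price:  +{pct_more('2080 Ti', '2080', 'fe_usd'):.0f}%")  # +50%
print(f"1080 Ti vs 2080, cores:     +{pct_more('1080 Ti', '2080', 'cores'):.0f}%")   # +22%
```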
Well, they said the SM array was entirely new, so we may not be able to directly compare shader performance between the two.
I bought two Ti and two 2080.
Would you mind sharing which ones you purchased? And I think I'll hold off on buying one until I get an [H] review. Looking forward to seeing what it can handle with current content.
Can't wait for reviews! Although I will have to admit, it's really odd they showed no performance numbers... makes ya wonder.
Well, 768 more cores, a 21% increase, and a 1635MHz boost clock on the FE card, but Tom Petersen was saying it's a monster overclocker, and he usually does not lie. I would guess, with the better memory bandwidth, possibly 25% faster in GPU-heavy non-RTX games. Just a guess.
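For what it's worth, here's the naive math behind a guess like that, assuming performance scales with cores times clock and ignoring IPC changes (clocks are the published FE boost clocks; the scaling model itself is the assumption):

```python
# Naive scaling: performance ~ cores x clock, ignoring IPC and
# memory-bandwidth changes. Clocks are the published FE boost
# clocks; the scaling model itself is the assumption here.
gtx_1080_ti = {"cores": 3584, "boost_mhz": 1582}
rtx_2080_ti = {"cores": 4352, "boost_mhz": 1635}

core_gain  = rtx_2080_ti["cores"] / gtx_1080_ti["cores"]          # ~1.21
clock_gain = rtx_2080_ti["boost_mhz"] / gtx_1080_ti["boost_mhz"]  # ~1.03

print(f"Extra cores: {rtx_2080_ti['cores'] - gtx_1080_ti['cores']}")  # 768
print(f"Naive uplift: +{core_gain * clock_gain - 1:.0%}")             # roughly +25%
```

Which lands right around that 25% guess before any IPC or bandwidth gains are counted.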
Why doesn't the H twitter account have a verified checkmark next to it?
I dunno.
You have to request it, and you can only do so when someone has tried to impersonate you.
Ah. So Kile Binnit hasn't tried to take it yet.
Reckon I'll be hanging on to my 780ti a bit longer, because my plan on getting a "good deal" on a new 1080ti probably ain't gonna happen, since there's no downward price pressure from the new generation.
Way too early to draw meaningful conclusions, but at 1080p this can't be good:
https://www.dsogaming.com/news/nvid...-with-60fps-at-1080p-with-rtx-on/#more-116188
Can't read that article, but PCGH points out a few important things, such as the implementation still being in development and ray tracing not being ready when the game is released.