Discussion in 'HardForum Tech News' started by AlphaAtlas, Sep 25, 2018.
More like 1070Ti Used for Mining Edition for $300
More like 1080Ti used for $500 plus an extra hundred dollars to buy games or a 2070 for $650+ that gets its face stomped by the 1080Ti.
Love the price. I am going to lose an arm to replace my 1070.... (sarcasm intended)
Indeed. I ran a 5770 that I think I paid around $140 for. 6-12 months later I added a second in CrossFire just because I wanted to try it. That setup was able to run anything at 1080p/60 FPS as long as anti-aliasing was off, due to its very slow memory. Served me well for 3 or 4 years.
I understand how excessively demanding 4K and 1440p/144Hz are, but there should still be an entry-level 1080p screamer like that 5770 was in 2009-2010, something that can handle everything at 1080p medium for ~$150.
You can get a Radeon RX 570 4GB for $159.99 after rebate right now. That's probably the closest we're going to get to an affordable 1080p/medium quality card.
Why is it considered a paper launch?
It's not, though, not really. The 2080 is 10% faster than an FE 1080 Ti, but so is nearly every AIB 1080 Ti. The FE was constrained by a poor heatsink, which they got rid of for the 20xx series. Put the 2080 against any of the non-blower-fan 1080 Ti cards and they're neck and neck.
What's crazy is that's what I paid for my RX 480 4GB - 2 years ago! And not just any RX 480, an MSI Gaming X one. The mining craze inflated prices like crazy, and they're only now getting back to where they were two years ago.
And then Nvidia released the 20xx series, because apparently the prices weren't inflated enough.
I'm in a bit different boat than most waiting for the 2070. I just bought my 1080 new from EVGA, so this will be out while I still qualify for Step-Up. If the performance is on par and I'm out $150 or less to get the new features that I can't actually use yet, I'll most likely snag one. I'm willing to pay for the future-proofing as long as it's not a fortune. Plus I only really play one game, and per the official release from Nvidia, DLSS makes a big improvement there, so it should be an upgrade for me either way once the DLSS drivers are released for PUBG.
Looking into my crystal ball... I see a gain of 0 performance, 1 ray tracing that's too slow for maxed settings at 1080p, all the DLSS that FF 15 can handle, and you will feel like a great weight has been lifted... from your wallet.
Wait, the RTX 2070 has DVI? What the fuck! I have two 144Hz monitors that only take DVI, but I want the 2080ti... ffs!
Kind of funny. Just a few days ago the GTX 1080 Ti wouldn't have been considered a great buy based on price/performance. But now it looks like the best value compared to Turing.
No bad cards, just bad prices.
That being the case, the RTX 2070 will perform well below the 1080Ti.
It's odd they are releasing the top 3 cards at the same time.
They release all 3 now because 7nm will be out next year. This allows Nvidia to milk more dollars up front before the 20 series is obsolete in 12 months.
That's my assumption right now. It will perform worse than the years-old Pascal cards it is replacing, at a higher price as well. Nvidia done lost their fking minds, and I'm just sitting here with a 5GHz i5-8600k in my new build from late last year, still sitting on a POS RX 480, STILL waiting to upgrade. Just too pissed off at Nvidia and seriously thinking about buying a Vega just to hold me over. Vega should do fine on my 21:9 1440p FreeSync monitor anyway. Was hoping to buy a 2080, but my god, the price and the performance? Turing is just garbage right now.
I may not do anything; I'm so used to just waiting due to the mining bubble over the last year. Still find it hilarious that many of us were around pointing out that it was a huge irrational bubble, and hey, guess what? Plus I'm literally in the hospital right now with my newest son, who will be one day old in two hours.
Seriously though, Turing is the worst, stupidest, most disappointing video card release I've ever seen. Worse even than your typical AMD release.
My hunch, based on nothing but my hunch, is it'll be between a 1080 and 1080ti in performance. But my hunch also tells me the ray tracing with the 2070 will be too weak to do anything except being a marketing gimmick, so you're left with an overpriced card.
You must not have been in the game too long, if you think these are the worst let-downs.
How about the FX-5800?
Most S3 chips. (granted, we didn't expect TOO much)
Voodoo 4/5 series
Many Power VR chips (they could have been so much more)
i740? (it wasn't absolutely horrible, but not what it was supposed to be)
GeForce 256 SDR? This was a big advancement, but like Turing, it introduced features that it couldn't initially keep up with at the time. The GeForce 2 GTS/Ultra were what really brought T&L forward.
Those are a few that spring to mind. There have been some seriously disappointing chips released over the years though. These GTX 20xx cards are actually very good cards. The fastest on the market. They're just dividing the user base though due to features and performance in older games. They're still the fastest thing you can buy.
Can't wait to preorder this to replace my 970!
The RTX 2080 is actually 5% slower than the 1080 Ti.
It's game dependent.
Matrox Mistake: some games looked worse than software rendering thanks to its horrible texture filtering. Still, people swore by them for some reason; it was impossible to get them to realize there were cards that mopped the floor with the Mistake.
S3, known for the world's first hardware DEcelerator cards; games ran better in software mode. "Hey look Jimmy, you TOO can have horrible frame rates after paying half as much as a real 3D accelerator would run you!" (TM)
Voodoo 4 was slow, Voodoo 5 was OK, but yes, it did lose out to the GeForce 2. Still, it was quite a good card for Glide.
GeForce 256: get DDR or forget it. The DDR version was about twice as fast. The SDR version went mostly into OEM systems, where they conveniently left out, or hid in the fine print, that you got the SDR version.
When are actual ray-tracing benchmarks going to be released? Do we need to wait for Battlefield, or will Shadow of the Tomb Raider get patched soon?
Hmm, "the more you buy, the more you save" doesn't sound so enticing now, does it? No one knows. It was one of the first things to pop into my mind when watching the reveal: no dates for any RT titles anywhere to be found, so that was a big reason I chose to hold off. The new Metro is supposed to be out around Feb. '19. At least one RT title at launch would have made the card much more enticing.
I think this will be the best buy of the Turing lineup, which isn't saying a lot. Whereas the 2080 was neck and neck with the 1080ti (although with less memory), the 2070 should perform well above the 1080, especially in newer DX12 titles, due to keeping the same bandwidth as the 2080.
The 2070 looks to be more like a 2070ti than a 2070.
After the W10 October update.
There are HDMI or DisplayPort to DVI cables out there. I use 'em all the time on Windows, Macs, and especially to connect Raspberry Pis to older monitors.
If you have a FreeSync monitor, why is that even a question? Get the Vega, love it, be happy!
Also, Turing kind of reminds me of tessellation when it first launched. Everyone was tessellation this and tessellation that, and we're going to tessellate the shit out of everything! And then it turned out that tessellation was too taxing for the GPUs, and everyone turned it off because the performance hit wasn't worth the minor improvement in image quality.
Oh, and most importantly, congrats on your baby boy!
Not sure why you guys are guessing at the 2070 performance. It'll be ~21% slower than the 2080, so slightly ahead of the 1080, and, when/where DLSS is applicable, ahead of the 2080ti.
Just take ~21% off the 2080 reviews and bam, you have 2070 performance....
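That back-of-the-envelope rule is easy to apply yourself. A minimal sketch, using made-up placeholder FPS numbers rather than real benchmark results:

```python
# Sketch of the "take ~21% off the 2080" estimate described above.
# The scaling factor and the FPS figures are illustrative assumptions,
# not measured benchmarks.

def estimate_2070_fps(fps_2080: float, scaling: float = 0.79) -> float:
    """Scale a 2080 benchmark result down ~21% to guess 2070 performance."""
    return round(fps_2080 * scaling, 1)

# Hypothetical 2080 review numbers (frames per second):
review_2080 = {"Game A": 100.0, "Game B": 144.0, "Game C": 62.0}

estimates = {game: estimate_2070_fps(fps) for game, fps in review_2080.items()}
print(estimates)  # {'Game A': 79.0, 'Game B': 113.8, 'Game C': 49.0}
```

Of course, real scaling varies per title (memory bandwidth, resolution, and driver maturity all shift the gap), so treat the flat ~21% as a rough guide, exactly as the post suggests.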
I just bought a BNIB 1080ti FTW3 for $540. I hope I won't regret it at 5760x1080.
Sadly it doesn't support high refresh rate monitors.
You'd need to run an active DP to DVI-D (dual link) adapter for each display.
Caveat: Most of these adapters seem to be kind of hit and miss. You may be limited to 120Hz or live with the occasional flicker or monitor power off/on weirdness, from what I've read.
There's a post with other related info at Linus Tech Tips that might be handy, but the Dell Bizlink above is the least expensive adapter I came across in my admittedly brief search.
And a relevant Reddit post.
The other option, of course, is to sell your displays and pick up a set that features DisplayPort. The adapters are probably worth a shot, though.
Great release timing. $499? Great! When Black Friday rolls around and the prices are slashed*, THAT'S when it'll be time to pick one up.
*slashed to $498.99. Sigh. But then again, Nvidia is evil. Screw 'em.
Should be about perfect. My 3440x1440 is very close on MP and a 1080ti pushed it very well. The extra VRAM probably doesn’t hurt either.
I would have skipped this gen if it wasn’t for VR.
What the fuck is a regular version?
We used to call them reference now they are founders editions and whatever the fuck amd calls them.
And a price premium for a bone stock card too.
I read everything about the active adapters a year ago, and as you said, it really is hit and miss. I have three 144Hz monitors and one 165Hz monitor, and two of them are DVI only, so I can still use my 2080ti with DisplayPort, but I always prefer DVI when given the choice as it just seems more stable most of the time. What I'm pissed about is the fact that the RTX 2070 has native DVI, but the 2080/2080ti lacks it for no apparent reason. I think there is only one custom PCB out there, so eventually DVI could come back with custom AIB cards.
Could this be the most pointless graphics card ever released? I feel sorry for the mugs that are going to spend cash on this.
Oh man, why did I decide to look at the prices? Now I am sad. I am completely broke and I can't stand HD 4600 gaming anymore... Intel HD 4600. LOL.
And this? Oh come on!
That sucks, man. Sorry!