NVIDIA GeForce GTX 1070 Founders Edition Preview @ [H]

Funny that he said don't bother buying it for 1080p gaming, get something less powerful/expensive. Does that mean wait for a 1060? I don't think so! Maybe I'm going to be at 1080p for a long time, but I'll have a card that can run with all the eye candy maxed out.
That's the plan here as well, unless AMD pulls a rabbit out of their hat in the next month.
 
But, do the majority of users here have 1440p displays?
I got a 27" 1440p korean display over two years ago for a whopping $230. The price of entry is not exactly high. Also every new card coming out will have the ability to absolutely smoke 1080p. Everything about a R9-380/GTX960 can smoke 1080p. So is it really meaningful whether you get 140fps or 127fps?
 
Read the review at Tom's Hardware last night. Nah, it's either 1080 or bust for me. Good card though for the more budget conscious.
 
Also every new card coming out will have the ability to absolutely smoke 1080p. Everything about a R9-380/GTX960 can smoke 1080p. So is it really meaningful whether you get 140fps or 127fps?

Smoke? Only if you want to play at 1080p with 4K-like settings and framerates on a 980 Ti... I mean, none of those cards, and even the more powerful cards on the market, can max out some games at 1080p. I'm not one of those who buys a card and then, from day one, starts turning settings down or off and is happy with that just to enjoy a constant >60 FPS. There are still a lot of 1080p users who play at this resolution because of 120Hz/144Hz panels, and for those of us who like high fps there's not enough performance on the market to upgrade to 1440p@120+Hz.

So SixFootDuo's request for 1080p is pretty valid. I game with an OC'd 980 Ti at 1080p just to be able to get as close to 120 fps as possible. Of course, in lighter or less demanding games I tend to use DSR up to 4K if 60 FPS minimums are achievable, as not every game requires 120+ fps to be enjoyable. I'll probably upgrade to a high-refresh, low-latency 1440p monitor (probably a ROG Swift, as I have two and love them) if the next 1080 Ti or Titan offers the same performance jump as the 980 to the 1080.
 
Performance Benchmarks - page 2: NVIDIA GeForce GTX 1070 review: A Titan X at less than half the price

[Image: gtx-1070-hitman_0.jpg (Hitman benchmark)]

Fury faster than 1070?
 
Great card, but the Titan X comparison reeks of hyped-up marketing. It seems about equal, and I am sure an OC'd Titan X would win out.
 
I will probably get a 1070 and see how it likes my Raven case. Then my son can try to get SLI 970s working. We both game at 1920x1080, so this card should hopefully be overkill, and yet somehow use less overall system power.
 
Great card, but the Titan X comparison reeks of hyped-up marketing. It seems about equal, and I am sure an OC'd Titan X would win out.

It's a fact. It's big news. Insane performance for the price. Who cares what an OC'd one does - you can OC a 1070, too.
 
Great card, but the Titan X comparison reeks of hyped-up marketing. It seems about equal, and I am sure an OC'd Titan X would win out.

But then we get 1070s that can overclock and everything balances out. You still get Titan X performance for ~$400, everyone wins :D
 
The new SLI HB (High-Bandwidth) technology introduced with Pascal-based GTX 1080 and GTX 1070 graphics cards should theoretically double the bandwidth compared to older SLI technology on the Nvidia Maxwell-based graphics cards. Unfortunately, it appears that only 2-way SLI will be possible with SLI HB and those are the only SLI HB bridges that will be available.

With the new HB SLI tech, I wonder how micro-stutter and support will be handled.
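As for the bandwidth claim itself, the "double" figure is easy to sanity-check if you assume the gain comes purely from driving both SLI fingers as parallel links. A minimal sketch under that assumption (the per-link figure is an arbitrary placeholder, not an official spec):

```typescript
// Sketch: aggregate bridge bandwidth = number of links x per-link bandwidth.
// perLinkBandwidth is an arbitrary placeholder, not an official Nvidia spec.
const perLinkBandwidth = 1; // arbitrary units

const legacyBridge = 1 * perLinkBandwidth; // single-link Maxwell-style bridge
const hbBridge = 2 * perLinkBandwidth;     // SLI HB drives both fingers

console.log(hbBridge / legacyBridge); // 2 -> "double the bandwidth"
```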
 
HardOCP is my default tech web site. Appreciate the reviews, etc.

But, do the majority of users here have 1440p displays? I always have to go elsewhere to get the 1080p scores just so I have a better idea of what I can expect from any given video card.

Wonder if there is a way HardOCP can see the resolutions people are using while browsing, just to get a sense of what the majority is running.
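For what it's worth, browsers do expose this, so a site could collect it client-side. A minimal sketch (the /log-resolution endpoint is made up for illustration):

```typescript
// Sketch: report the visitor's display resolution to the server.
// window.screen reports the display, not the browser window; multiplying
// by devicePixelRatio approximates physical pixels on HiDPI screens.
function reportResolution(): void {
  const payload = JSON.stringify({
    width: Math.round(window.screen.width * window.devicePixelRatio),
    height: Math.round(window.screen.height * window.devicePixelRatio),
  });
  // "/log-resolution" is a hypothetical endpoint; sendBeacon posts the
  // data without blocking navigation.
  navigator.sendBeacon("/log-resolution", payload);
}

reportResolution();
```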

I've had a few 4K displays but sold them for a profit after clients saw them. I've not gone back to 4K, and won't until video cards have the power to make 4K gaming silky smooth.

I'll be on 1080p for a while, I fear.

Here's the thing about 1080p: if you see that a game is "maxed out" at 1440p in our testing, you can guarantee it will be "maxed out" at 1080p as well.

1080p is going to be more CPU-dependent and won't show the true potential of these very fast video cards; sometimes even 1440p isn't enough. For example, we need to run 4K on the GTX 1080 to really see scaling; that review will be coming.
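A rough way to picture the CPU limit: the framerate you observe is the lower of what the CPU can feed and what the GPU can render at that resolution. A toy sketch (the numbers are invented for illustration, not from our testing):

```typescript
// Toy bottleneck model: observed fps is capped by the slower component.
// All numbers below are invented for illustration only.
const cpuCapFps = 160; // frames/sec the CPU can prepare, roughly resolution-independent

const observedFps = (gpuFpsAtResolution: number): number =>
  Math.min(cpuCapFps, gpuFpsAtResolution);

console.log(observedFps(220)); // 1080p: GPU could do 220, you see 160 -> CPU-bound
console.log(observedFps(140)); // 1440p: you see 140 -> the GPU difference shows
```

At 1080p more cards bump into that CPU ceiling, post similar numbers, and the GPU comparison flattens out.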

Some games perform so well at 1440p that 1080p just doesn't show anything relevant. We also have to consider the price point of the video card. A $400 video card, or a $700 video card, warrants the use of a 1440p display. A person spending that much money isn't going to cheap out on a 1080p display. 1440p displays are quite affordable today.

Now, if there is a 1060 line, that will probably be done at 1080p, like we stuck with 1080p for the GTX 960 and Radeon R9 380/X GPUs. However, even if we do some 1080p, I feel a Pascal 1060 could possibly be viable at 1440p as well; we'd have to try it out and find out. In that case, testing both resolutions would make sense. But for the 1070/1080, 1440p as a minimum makes the most sense. We unfortunately lack the time to test every resolution in every review; choices and focus have to be made. However, follow-ups can occur if needed.
 
We also have to consider the price point of the video card. A $400 video card, or a $700 video card, warrants the use of a 1440p display. A person spending that much money isn't going to cheap out on a 1080p display. 1440p displays are quite affordable today.

I can argue this statement simply by saying...

My wife lets me do one major upgrade each year... I currently have a 1920x1200 monitor and would love to have a 4K monitor... So, should I get a 4K monitor with my AMD 7970 (which can't really push the pixels), or get a 1080 card now and a 4K monitor next year?

BTW, I was thinking two 1070s, then the 4K next year ;)
 
1080p is noted, but honestly you can look at our data and extrapolate what would be playable at 1080p quite easily.

The Division - No problem running max Ultra game settings with PCSS, 100% object detail, HBAO+, and High Reflections; basically the highest in-game settings minus HFTS shadows.
Fallout 4 - Maxed-out game settings, same as 1440p
The Witcher 3 - Maxed-out game settings, High Post Processing, HBAO, Ultra global setting, 8X AA HairWorks
Rise of the Tomb Raider - Max "Very High" manual settings, and possibly 2X SSAA; if not, then SMAA, same as 1440p
BF4 - Same highest settings, just a whole lot more performance, above 100 FPS

1080p would allow the highest possible game settings across the board on GTX 1070.

To see the best scaling and difference of performance between GPUs we must test at a higher resolution, in this case 1440p for GTX 1070.
 
I can argue this statement simply by saying...

My wife lets me do one major upgrade each year... I currently have a 1920x1200 monitor and would love to have a 4K monitor... So, should I get a 4K monitor with my AMD 7970 (which can't really push the pixels), or get a 1080 card now and a 4K monitor next year?

BTW, I was thinking two 1070s, then the 4K next year ;)

The sad fact is that your wife decides what you need to do :meh:
 
I assume you're not married, or you're about to get divorced, if you think you can spend the family's money on just yourself without talking to your wife about it... :p

Understand we're not talking $50 here or there, we're talking $800.
Quite the opposite, lol. I'm married, but with both of us working I see no problem. It all comes down to how you manage your money. For example, we don't drink alcohol or eat out often (not even mentioning fast-food junk), so that already saves us considerable money. I got all the hardware in my rig (see signature) partly last year and partly this year, and it was not a problem at all. What I was implying is that it's like you're not allowed to make your own decision on how much you want and can afford to spend; your wife regulates it. So no offence, but marriage is not submission to your wife ;) You need to adjust your expenses then. But that's already off-topic, so I'm done here :D
 
I'm in the same boat as you. I doubt I will ever go beyond 1440p in the next 5 years, unless high-quality, highly recommended 4K monitors hit the sub-$400 mark. I currently game at 1080p. Trying to balance future-proofing against the price I'm willing to pay for performance is a hard call to make. I want to step up to 1440p, this much I know. But I also don't want to drop $600 on a 1080 and find out the Ti refresh makes it look like a paperweight with HDMI inputs.

My goal is to get VR-ready (well, improved; technically I can run it now). So the 1080 is probably worth it. However, since I can't buy the VR gear for a while anyway, I might just wait for the Ti version or whatever comes out in 8 months or so.
 
Quite the opposite, lol. I'm married, but with both of us working I see no problem. It all comes down to how you manage your money. For example, we don't drink alcohol or eat out often (not even mentioning fast-food junk), so that already saves us considerable money. I got all the hardware in my rig (see signature) partly last year and partly this year, and it was not a problem at all. What I was implying is that it's like you're not allowed to make your own decision on how much you want and can afford to spend; your wife regulates it. So no offence, but marriage is not submission to your wife ;) You need to adjust your expenses then. But that's already off-topic, so I'm done here :D

First, you dragged this off-topic.

But because my wife and I *together* decide on larger purchases... that's submission? Sounds like communication to me.
We balance our budget, spend what we can afford, etc.

But *NOT* talking to your wife about spending money... as I said, *you're about to get divorced if you think you can spend the family's money on just yourself without talking to your wife about it...*
 
The sad fact is that your wife decides what you need to do :meh:

lol, I was thinking that as well. I know a couple guys like that, and it's not even finances, but anyways :). If it was me, I'd get the video card this year and a monitor next year.
 
I assume you're not married, or you're about to get divorced, if you think you can spend the family's money on just yourself without talking to your wife about it... :p

Understand we're not talking $50 here or there, we're talking $800.
Amen, brother! I'm in the same boat you are. The Mrs. usually wants stuff like working washers, dryers, kids' clothing, family vacations, etc. I get about one solid upgrade a year myself, so I feel your pain. Mine is probably closer to $500 though... I'm jealous of your $800.
 
Smoke? Only if you want to play at 1080p with 4K-like settings and framerates on a 980 Ti... I mean, none of those cards, and even the more powerful cards on the market, can max out some games at 1080p. I'm not one of those who buys a card and then, from day one, starts turning settings down or off and is happy with that just to enjoy a constant >60 FPS. There are still a lot of 1080p users who play at this resolution because of 120Hz/144Hz panels, and for those of us who like high fps there's not enough performance on the market to upgrade to 1440p@120+Hz.

So SixFootDuo's request for 1080p is pretty valid. I game with an OC'd 980 Ti at 1080p just to be able to get as close to 120 fps as possible. Of course, in lighter or less demanding games I tend to use DSR up to 4K if 60 FPS minimums are achievable, as not every game requires 120+ fps to be enjoyable. I'll probably upgrade to a high-refresh, low-latency 1440p monitor (probably a ROG Swift, as I have two and love them) if the next 1080 Ti or Titan offers the same performance jump as the 980 to the 1080.

Asking for 1080p reviews of high-end cards is asinine, though. This is HardOCP, not MundaneOCP.

Besides, if you're really that serious about "smooth gaming" you'd have bought an adaptive-sync display and the card to support it. Staying at 1080p "for the frame rate" is just stupid.
 
1080p is noted, but honestly you can look at our data and extrapolate what would be playable at 1080p quite easily.

The Division - No problem running max Ultra game settings with PCSS, 100% object detail, HBAO+, and High Reflections; basically the highest in-game settings minus HFTS shadows.
Fallout 4 - Maxed-out game settings, same as 1440p
The Witcher 3 - Maxed-out game settings, High Post Processing, HBAO, Ultra global setting, 8X AA HairWorks
Rise of the Tomb Raider - Max "Very High" manual settings, and possibly 2X SSAA; if not, then SMAA, same as 1440p
BF4 - Same highest settings, just a whole lot more performance, above 100 FPS

1080p would allow the highest possible game settings across the board on GTX 1070.

To see the best scaling and difference of performance between GPUs we must test at a higher resolution, in this case 1440p for GTX 1070.
For some reason you seem to think that everyone targets 40-60 fps.
 
Asking for 1080p reviews of high-end cards is asinine, though. This is HardOCP, not MundaneOCP.

Besides, if you're really that serious about "smooth gaming" you'd have bought an adaptive-sync display and the card to support it. Staying at 1080p "for the frame rate" is just stupid.

What if you have a 144Hz+ monitor? ASUS just announced a 180Hz 1080p monitor.

1920x1080 is absolutely a resolution worth testing on high-end cards.
 
Higher than 144Hz is silly; you're not going to get the same noticeable improvement from 144Hz to 180Hz that you would from 60Hz to 90+Hz. IMO, even if you're after buttery-smooth first-person-shooter performance, once you hit the 120fps refresh breakpoint, increase the resolution instead. With higher res you can reduce some AA settings and still have great-looking images if you still prefer 144Hz.
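The frame-time math backs that up: each step up in refresh rate buys a smaller reduction in frame time (a quick back-of-the-envelope check):

```typescript
// Frame time in milliseconds at a given refresh rate.
const frameTimeMs = (hz: number): number => 1000 / hz;

console.log(frameTimeMs(60) - frameTimeMs(90));   // ~5.6 ms saved going 60 -> 90 Hz
console.log(frameTimeMs(144) - frameTimeMs(180)); // ~1.4 ms saved going 144 -> 180 Hz
```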
 
First, you dragged this off-topic.

But because my wife and I *together* decide on larger purchases... that's submission? Sounds like communication to me.
We balance our budget, spend what we can afford, etc.

But *NOT* talking to your wife about spending money... as I said, *you're about to get divorced if you think you can spend the family's money on just yourself without talking to your wife about it...*
You've totally missed my point. The way you described it in your original post doesn't look like a conversation; on the contrary, it reads like your wife decides what you get, because it's her call and you're just a tool, or you can't make the right decision yourself. So stop taking it so personally and wishing divorce on me just because you think all couples are like you guys. I'm sorry for opening that shitstorm; it was not my intention, and I feel sorry about it. Again, maybe you need to review your expenses to get some flexibility, just some advice.

Anyway, back to the topic. In all the reviews that I saw there were sometimes quite different results for this card, sometimes about a 10-15 fps difference with seemingly the same settings. Quite confusing!
 
One rig (check sig... it's got a 970) running dual 1920x1200. I'd like to get a third... The 1080p numbers let me extrapolate. The pics upstream showing DX11/12 at various detail settings sure were helpful.
 
I take it you don't play modern games. Unless you're telling me a 960 can smoke my 970. It does do well, but in the newest titles it can't get a smooth 60.
Typo, I meant to say everything above a 380/960 can smoke 1080p.
 
For some reason you seem to think that everyone targets 40-60 fps.

If fps is what you're after, just disable all the eye-candy settings and enjoy, or lower your res. There's no point in having a top-tier or second-tier card if all you want is frame rates, unless you're trying to put off a video card purchase for another decade.
 
1080p is noted, but honestly you can look at our data and extrapolate what would be playable at 1080p quite easily.

The Division - No problem running max Ultra game settings with PCSS, 100% object detail, HBAO+, and High Reflections; basically the highest in-game settings minus HFTS shadows.
Fallout 4 - Maxed-out game settings, same as 1440p
The Witcher 3 - Maxed-out game settings, High Post Processing, HBAO, Ultra global setting, 8X AA HairWorks
Rise of the Tomb Raider - Max "Very High" manual settings, and possibly 2X SSAA; if not, then SMAA, same as 1440p
BF4 - Same highest settings, just a whole lot more performance, above 100 FPS

1080p would allow the highest possible game settings across the board on GTX 1070.

To see the best scaling and difference of performance between GPUs we must test at a higher resolution, in this case 1440p for GTX 1070.

That's not really true. When the Fury X came out, if we had only extrapolated the 4K results, the Fury X would have come out on top in many games (which is what AMD probably wanted).

Going by Steam stats, 1440p is far from popular, and it's already losing ground to 4K.

So to better serve your readers I think you should include 1080p results.
 
1920x1080 is absolutely a resolution worth testing on high-end cards.
Then start your own review site that does that. The [H] thinks it is a waste of their time for their target audience, and it's a waste of a LOT of time, because they don't just run a canned benchmark and throw a number at you like so many sites: the [H] takes the time to do reviews correctly.

See, this is the thing: it's their site so they get to decide what to review and how. Don't like their choices? Well, there are plenty of other review sites, surely there is one somewhere that does reviews of what you want the way you want it done. Of course, some of them are incompetent, and some are shills just pandering for review samples.

Me, I'm only interested in how our soon-to-be-purchased GTX 1080s will perform when driving our new Acer Predator 32" UHD G-Sync monitors.
Our 970s can't drive those monitors at max frame rates with all the eye candy on, but they still look much nicer than our previous non-G-Sync 1080p (hers) and 1440p (mine) monitors.

G-Sync and FreeSync are pretty amazing, and UHD on a 32" monitor is like looking through a window. If you haven't experienced them together, you don't know what you are missing.
 
That's not really true. When the Fury X came out, if we had only extrapolated the 4K results, the Fury X would have come out on top in many games (which is what AMD probably wanted).

Going by Steam stats, 1440p is far from popular, and it's already losing ground to 4K.

So to better serve your readers I think you should include 1080p results.

Hell yes. I don't have a 1440p monitor, TV, or projector, and I'm never going to buy one. When I upgrade, those will all go to 4K. 1440p is an intermediate resolution as far as I'm concerned. So all I really want to see data on is 1080p for all the displays I have now, and 4K for when I upgrade. And sure, I know a card getting 60+ FPS at 1440p can smoke 1080p, but I want to see the numbers, whether it gets to 120 or whatever at 1080p :D

I still dig the [H] reviews, just throwing my feedback in. The reviews would apply more to me if they included 1080p results.
 
That's not really true. When the Fury X came out, if we had only extrapolated the 4K results, the Fury X would have come out on top in many games (which is what AMD probably wanted).

Going by Steam stats, 1440p is far from popular, and it's already losing ground to 4K.

So to better serve your readers I think you should include 1080p results.

If one only tested 4K, then yes, one could draw that conclusion from the Fury X. But [H] also tested 1440p with the Fury X, so by extrapolating those two results to 1080p, one would conclude that the Fury X performs better at higher resolutions but lags behind at lower ones.

For most intents and purposes, though, I just add 75% to the 1440p framerates to get a rough 1080p performance value. But generally speaking, Brent has a point: if a card can max a game out, or come close, at 1440p, it will have no problem maxing it out at 1080p.
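That +75% rule of thumb lines up with the raw pixel counts, assuming performance scales roughly with the number of pixels rendered (a quick sanity check, not a benchmark):

```typescript
// How many more pixels does 1440p push than 1080p?
const pixels1440p = 2560 * 1440; // 3,686,400
const pixels1080p = 1920 * 1080; // 2,073,600

console.log(pixels1440p / pixels1080p); // ~1.78 -> ~78% more pixels at 1440p
```

Real scaling is rarely perfectly linear, though; CPU limits eat into the gain at 1080p, which is exactly Brent's point.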
 