wasteomind (Gawd, joined Aug 13, 2004, 522 messages):
So perhaps it has been asked already and I missed it, but are monitor overclocking and Boost 2.0 hardware/Titan-exclusive features, or will we see them rolled out in drivers/utilities?
Quote:
Common sense dictates that performance alone cannot be the only potentially desirable attribute of a graphics card to all prospective buyers.

Careful: instead of recognizing common sense, someone may accuse you of being too poor to afford one.
If you're spending $1,000 to use this for compute, you're being cheap, because if you were making money off this card you would have already gotten a Tesla.
This is what kills me, and it gets lost in pages of talking:
580 - 384-bit - $500
680 - 256-bit - $500
Titan - 384-bit - $1000
Titan was supposed to be the 680; the 680 was supposed to be the 660 Ti.
Premium build materials and 3GB of RAM do NOT equal $500.
This is a gouge, and the smart people agree. The people with vested interests do not agree. The people who don't know the facts above have no valid opinion.
It kills me that people are pussyfooting around the facts. I 100% agree this is just a move by Nvidia to gouge the customer, especially when they call it the "TITAN" and not a numbered SKU.
Quote:
$1,000 Titan today, so where does that put a 780 in a few months? Even if it's midway between 680 and Titan speeds, if it's $500-600 I'd be an angry Titan owner, lol.

They will likely continue to diverge the compute and gaming cards further while positioning the Titan as a middle-ground product. I doubt the 780 will compete with the Titan on GPGPU and compute performance.
Some of you people trying to claim that the 680 was really supposed to be the 660 and the Titan should have been the 680 because of bus width are really, really grasping at straws.
Considering R600 is not looking to be an 8800-killer, it would seem that NVIDIA has miscalculated the 8800 Ultra product and only ended up marketing against themselves and looking somewhat foolish and arrogant with this overpriced product that seems to be little more than a PR part. Sorry NVIDIA, your Ultra is just not worth the money.
But, it sure does seem like flagship parts are now being raised into a new category.
Quote:
If you're spending $1,000 to use this for compute, you're being cheap, because if you were making money off this card you would have already gotten a Tesla.

You have absolutely no idea what you're talking about. The GPGPU applications I use do not need double precision or ECC, and they work with regular gaming cards. It would be a complete waste to spend $3K on a single K20X; a $1K Titan is much more reasonable. Just because I use the hardware to make money doesn't mean I have an unlimited budget: you have to go with the best price/performance/feature ratio for your needs. I also have to contend with software licensing costs that run into the thousands, which means I need to pack as much compute power into a single workstation as possible. A triple-Titan setup with 6GB per card for $3K would allow me to work on larger scenes (more geometry and textures) with less noise and power usage than a triple 690 at the same $3K. It would also cost $6K less than a triple K20X at $9K. That's $6K that can go toward other, more important expenses. That's not called being cheap; that's called being smart with your money.
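The budget math in the post above is easy to check as a back-of-the-envelope calculation. This is just a sketch using the round per-card figures quoted in the thread ($1K Titan, $3K K20X), not official MSRPs:

```python
# Back-of-the-envelope cost comparison, using the round prices
# quoted in the thread (not official MSRPs).
TITAN_PRICE = 1_000   # GTX Titan, per card
K20X_PRICE = 3_000    # Tesla K20X, per card (poster's figure)
CARDS = 3             # triple-GPU workstation

titan_setup = CARDS * TITAN_PRICE   # cost of three Titans
k20x_setup = CARDS * K20X_PRICE     # cost of three K20Xs
savings = k20x_setup - titan_setup  # what the Titan build frees up

print(f"Triple Titan: ${titan_setup:,}")
print(f"Triple K20X:  ${k20x_setup:,}")
print(f"Savings:      ${savings:,}")
```

At the quoted prices, the triple-Titan workstation comes in at $3K against $9K for triple K20Xs, a $6K difference.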
Are [H] going to address the fact that they state they used the "latest" CAPs with the 13.2 beta drivers? The 12.11 CAP2 are the latest CAPs and are NOT meant to be used with the current 13.2 beta drivers; they actually reduce performance in many of the games tested because they overwrite the later optimised profiles built into the drivers.
I was trying to illustrate that not all 'professional' compute users need DP, as mt2e suggested that professionals should have bought the K20X months ago if they weren't being cheap.

Titan is not artificially handicapped like the 580s/480s were; all its compute performance is unlocked. See Anand's article for more details. Moreover, this card is also an entry-level K20 compute card: you can actually go into the drivers and switch it to this mode. The only trade-off is that you lose clocks/boost clocks when you unlock the Tesla-K20X-like performance, but you gain all the DP performance.
@Kyle and Brent - A favor to ask: Would it be possible for you to gauge CUDA performance by installing the free Octane v1.1 Demo here:
http://render.otoy.com/downloads.php
The benchmark scene 'Octane_Benchmark.ocs' can be found here:
http://render.otoy.com/downloads/OctaneRender_1_0_DemoSuite.zip
To start the bench, right click on the node called 'RenderTarget PT' and select 'Render'.
Underneath the image, in the lower-left corner, should be a figure of the form X.XX Ms/sec (megasamples per second). For reference, a GTX 680 averages 3.01 Ms/sec.
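For anyone who runs the benchmark, a trivial helper for turning the reported Ms/sec figure into a speedup over the GTX 680 baseline from the post (the 4.50 figure below is just a hypothetical result, not a measured one):

```python
# Convert an Octane benchmark result (Ms/sec, megasamples per second)
# into a speedup factor over the GTX 680 reference quoted above.
GTX_680_BASELINE = 3.01  # Ms/sec, from the post

def speedup(measured_ms_per_sec: float) -> float:
    """Return the measured result as a multiple of the GTX 680 baseline."""
    return measured_ms_per_sec / GTX_680_BASELINE

# Example: a hypothetical card reporting 4.50 Ms/sec
print(f"{speedup(4.50):.2f}x the GTX 680")
```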
Thanks!
^This.
Nvidia has raised the bar for the next gen cards with the Titan.
Nice review btw.
Maxwell is shaping up to be a beast, looking at Nvidia's GPU roadmap.
If we see another, let's say GM104 (GTX 880) part for $500, and a GM110 part (Titan 2) for $1000, this Titan thing could become the norm.
The fact is that nobody outside Nvidia knows why things happened the way they did or what was originally intended.
I'd wager that a decent number of people on the fab side of things know a bit too (or at least can put 2+2 together).
There are many of us who never have issues with SLI.
Hopefully there will be competition against the GM110 parts, so we'll see lower prices on the high-end cards. Looking forward to seeing what Maxwell will do, though.
I honestly can't believe that mistake was made.
AMD doesn't build big-die chips, so I doubt we'll see anything. AMD hopes to kill it with CrossFire.
I wish you were around yesterday and said THIS
rinaldo00 (Limp Gawd):
Quote:
Originally Posted by Zarathustra[H]
That is a good question.
All the comparisons to multi-GPU solutions are just silly. Granted, Nvidia's SLI is better than AMD's CFX, but they still come off as public beta tests: unstable, problem-prone, and never as fluid as a good single-GPU solution.
I have run SLI for over 5 years without any problems, what am I doing wrong?
then heard this
cannondale06 ([H]ardForum Junkie):
Quote:
Originally Posted by Skakruk
Well the first thing you're 'doing wrong' is being immune to microstutter. You're very lucky.
And claiming no problems is simply not true. There is zero chance that he can run SLI for 5 years and not have issues at some point. It's like the people who say they have never had a driver issue, ever. It is just nonsense and selective memory.
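For what it's worth, "microstutter" is measurable rather than a matter of memory: it shows up as high frame-to-frame variance in frame times even when the average FPS looks fine. A minimal sketch of that idea, using hypothetical frame-time samples in milliseconds (not captures from any real card):

```python
from statistics import mean, stdev

def frame_time_stats(frame_times_ms):
    """Average FPS plus frame-time jitter (stdev), the usual microstutter signal."""
    avg_fps = 1000.0 / mean(frame_times_ms)
    jitter = stdev(frame_times_ms)
    return avg_fps, jitter

# Two hypothetical captures with the SAME 50 FPS average:
smooth  = [20.0, 20.0, 20.0, 20.0]  # single GPU: even frame pacing
stutter = [10.0, 30.0, 10.0, 30.0]  # alternate-frame rendering: uneven pacing

print(frame_time_stats(smooth))   # ~ (50.0, 0.0)
print(frame_time_stats(stutter))  # ~ (50.0, 11.5)
```

Both captures report 50 FPS, but only the second one stutters, which is why an FPS counter alone can't settle this argument either way.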
I knew HardOCP would give this a gold award. They always seem to have a boner for expensive and impractical hardware.
There shouldn't be $1000 cards in the first place
Who else thinks this Titan release makes way for a 780 launching at $800? I mean, if they can get $1,000+ for this card, what's stopping them?
Everyone praises the Titan because it is great enough that we can finally do away with SLI problems.....then in the next sentence they say "I WANT 2 TITANS IN SLI"
Is that sane?
Question, what happens to the review cards?
Well yeah but you know what I meant... guys writing reviews at tech sites and people following things and posting on tech sites don't really know what's up. All we can do, and all the reviewers here can do, is go for a best guess approach.
I swear I've seen [H] reviews trash products for being too expensive, though. I just don't see the value in this outside of MAYBE SFF... and not really even then, TBH.