Discussion in 'nVidia Flavor' started by Blade-Runner, Sep 22, 2015.
Hopefully this will be the lethal blow that is needed to terminate AMD.
Why on Earth would you say that? Nvidia's drivers are so poor right now, even with AMD as weak as they are. With no competition at all, I really don't see how Nvidia wouldn't keep sliding down the slope to the abyss they're already on. Great hardware means nothing when your code monkeys are flunk-outs from DeVry...
Ya know, I used to love AMD. They made great cards and had good drivers.
But now, their cards do not have a RAMDAC or a DVI-TMDS transmitter. The fastest AMD cards with a DAC are the 7970/7990, and the fastest with a DVI-TMDS transmitter are the 290X/295X2. No RAMDAC means the cards cannot drive analog CRTs. No DVI-TMDS transmitter means the cards cannot drive digital LCDs over DVI. Basically, Furies and Fury Xs cannot be used with any monitor that isn't a crappy, pre-built LCD with DisplayPort or HDMI.
Adapters cannot convert digital to analog and get a usable signal (max is something like 1600x1200@60Hz). Adapters also cannot convert DP or HDMI to DVI at pixel clocks above 330 MHz. A 330 MHz pixel clock is totally usable, but you aren't going to get over 2560x1440@80Hz or 1920x1080@144Hz.
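The 330 MHz figure can be sanity-checked with a back-of-the-envelope pixel-clock estimate: active pixels per frame times refresh rate, plus blanking overhead. This is a sketch, not an exact timing calculation; the 1.12 overhead factor is an assumption that roughly approximates CVT reduced-blanking modes.

```python
def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.12):
    """Rough pixel clock in MHz: active pixels per frame * refresh rate,
    padded for horizontal/vertical blanking. The 1.12 factor is an
    approximation of CVT reduced-blanking timings, not a spec value."""
    return width * height * refresh_hz * blanking_overhead / 1e6

# Both modes mentioned above land right around the 330 MHz adapter limit:
for mode in [(2560, 1440, 80), (1920, 1080, 144)]:
    print(mode, round(approx_pixel_clock_mhz(*mode)), "MHz")
```

Pushing either the resolution or the refresh rate past those modes puts the required pixel clock over the limit, which is why the adapter route dead-ends for high-refresh DVI monitors.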
Why would you want that?
Wonderful. Then we get to buy way overpriced GPUs with little innovation, just like Intel is doing to us on the CPU side since AMD dropped the ball.
Are you actually this dense??
Stop posting when drunk.
You have never been able to get as much CPU for your dollars as you can today, despite AMD's irrelevance on the CPU market the last couple of years...
When is it expected to be released?
This is one of the most epic troll posts I've seen.
I REALLLY REALLLLY Need all my games to run at 144 FPS or I'll dieeeeee mom!
Just had to throw that blow in there at nvidia huh?
Not sure what problems you're having with their drivers, but they've been great for me in all the games I've played on Windows 10.
BRB building myself a custom LCD.
With VGA and DVI.
Let's see some leaked photos.
Grab another card with some 3D surround mixed in and tell me how you feel then
People with single cards and vanilla setups will rarely feel the pain of driver issues on either platform.
Dear Lisa Su,
Was kind of hoping we can avoid the nVidia vs AMD thing. Too late
Anyway, I'm looking forward to this. I might even go back to SLI if they released a dual-GPU card similar to the GTX 295 (with reasonable pricing, not the Titan Z's retarded pricing).
Yeah that 80% market share really sucks. No doubt they are going down.
So someone else takes its place.
I wonder if it will be on store shelves around the same time as DOOM.
In before S3 Savage3D or Matrox coming back...
Yeah and then you can pay even more for Nvidia hardware that doesn't improve as much every year.
Cool...I've been really anxious to see how the die shrink affects power and heat for upcoming Pascal and Pirate Islands chips.
Wait, they're not?
People generally buy monitors with a high refresh rate so that they can use it at a high refresh rate.
You would expect people to do what, run their 144Hz monitor at 60Hz and act like it's no big deal?
It's obvious, he's either an NVDA or INTC shareholder.
With Jen-Hsun being an Apple admirer, I bet he'd love to charge us all the 65% margins that Apple fans pay for their iCrap. We would all be paying workstation-class prices for midrange parts.
Must be, or see my 1st response.
S3? Fuck no. Matrox? Sure, why not?
If you are that concerned about RAMDACs, you can still use Nvidia cards to keep those ancient CRT displays running. But honestly, it's like complaining about a carriage company that stopped building carriage parts in favor of switching over to car parts. Stick with the ancient past if you must, but that's a dead end; deal with it, and stick with Nvidia if you need to live in the gutter-level past.
I wonder what the chances are that they'll stick with the Feb/Mar release for the new Titan....
Probably pretty good if HBM 2 is already in test pilot phase and slated for mass production in 2016. It makes sense from a quantity point of view to release Titan first since early samples will be limited and they can charge a premium for it and get great PR for Pascal. This time I don't think we'll see a 980 Ti equivalent so quickly unless AMD surprises us.
Anyone other than AMD. Heck, they don't even have to buy it.
The market is there, and it's becoming more attractive with all the 4K and VR stuff. Someone else will want a piece of that pie.
The latest mobile GPUs are becoming quite strong. Now I'm not saying PowerVR could make a Titan killer tomorrow, but something that can take on the GTX 960 would be a good start.
Like we do with AMD right now...
Exactly. Who the fuck would take its place? Intel's avoided that market because it failed at it. Everyone else who once sold products in this market has long since been absorbed or died off. There is more chance of someone stepping up to the plate to challenge Intel in the CPU market than there is a chance at a rival coming forward to face NVIDIA in the wake of AMD's demise.
So I'm guessing GP104, then GP110 later, as per usual? Or is this HBM2 stuff going to throw off the cycle?
Someone buying AMD's graphics business isn't necessarily going to be a good thing. Whatever company comes along to do this has to have some idea of how to manage the business and make better decisions than ATi/AMD made. While this is certainly possible, it isn't a sure bet. Plenty of companies have tried to buy a company for its IP and move forward in that business only to fail. National Semiconductor purchased Cyrix, which ultimately led to nothing. A similar scenario is just as plausible as the reverse. Personally, I'd rather see AMD recover and continue on.
Show me a modern game running at 144 fps at 1080p with full details, please. There's a reason G-Sync and FreeSync exist.
Intel has boatloads of competition from the bottom up in ARM, which is where the real fight is nowadays.
Terrible idea. Unless you want to pay $1000 for mid-range cards in the future.
Nvidia could have easily priced the GTX 970 at $400-$500 but they didn't.
The 290X was around $500 last year. The market as we know it today is all thanks to Maxwell.
Despite popular opinion, Nvidia actually wants people to BUY their GPUs. If a new mid-range part is $1,000 most people will decide to just keep their old card for another generation. I imagine it's a complex dance of supply/demand and profit margins, so it's not as simple as "More expensive = More profit hurr durr".
Don't they have a supercomputer order they have to fill first, like they did with Kepler?