IdiotInCharge
NVIDIA SHILL
Joined: Jun 13, 2003
Messages: 14,675
It's still a surprise, albeit a pleasant one. Really looking forward to seeing how well these cards scale to stupid resolutions across various arrays of displays.
4K eyefinity possible??
Nvidia still waves the PhysX flag in front of PC gamers, but I have yet to play a game that used it for anything worthwhile. I don't know how important it will be in the future, though - don't the PS4 and XBone have PhysX hardware acceleration? So might future console ports need it?
Does OpenGL still perform better on Nvidia, or is that just a rumor at this point?
PS4 and XB One have AMD hardware, and PhysX is an Nvidia exclusive, so there will be zero PhysX functionality. That's not to say they won't be capable of the same effects; it just won't be called PhysX.
Nvidia still waves the PhysX flag in front of PC gamers, but I have yet to play a game that used it for anything worthwhile,
Although I think PhysX is a big joke, I thought I read a while back that Nvidia will support PhysX on the PS4?
Hmmm, might have to research that one... been a while.
Edit: Yep, I did read about it.
http://www.techradar.com/us/news/ga...ffer-physx-support-on-amd-powered-ps4-1136063
Being a POS made that card flop. The follow-up wasn't much better either, really.
But increasing the bus width? They'd only do that if they were increasing the performance potential considerably, as that's expensive every way you look at it.
4K eyefinity possible??
Hmm, interesting... Had not read that before. Thanks for the link.
Although I think PhysX is a big joke, I thought I read a while back that Nvidia will support PhysX on the PS4?
Hmmm, might have to research that one... been a while.
Edit: Yep, I did read about it.
http://www.techradar.com/us/news/ga...ffer-physx-support-on-amd-powered-ps4-1136063
That will likely be CPU-based PhysX.
The thing I wonder about is that Sony hadn't released any info on the CPU/GPU of the PS4 until the summer, yet Nvidia made its announcement about the PS4 back in March. I wonder if Nvidia will still support PhysX on the PS4.
And if they can support it on the PS4, I don't see why they can't support the XBone.
Either way interesting times ahead!
Do you guys think the driver update for Eyefinity CrossFire frame times will be in the 290X?
The thing I wonder about is that Sony hadn't released any info on the CPU/GPU of the PS4 until the summer, yet Nvidia made its announcement about the PS4 back in March. I wonder if Nvidia will still support PhysX on the PS4.
And if they can support it on the PS4, I don't see why they can't support the XBone.
Either way interesting times ahead!
If AMD were smart, they would build hardware frame pacing into the card like the GTX 690. It worked so well for Nvidia that I think it would be a mistake not to.
That way they wouldn't have to keep updating drivers for certain games to work. It would work out of the box.
Anyway, my $.02.
If they did, it would be CPU-accelerated PhysX instead of GPU, which shouldn't be an issue for the PS4 or Xbox to handle, given that the Xbox 360 had zero issues with the CPU-accelerated Havok engine in BFBC1/2 and BF3.
Or save die space as well as power consumption while allowing for larger DRAM capacities.
How do you save space and power consumption by adding additional lanes to RAM, which require tons of transistors?
Because Dave Baumann stated that Pitcairn's PHY is ~50% the size of Tahiti's PHY, per 128b.
So far we've had the 2900 XT and GTX 280 using a 512-bit bus.
Neither of those were small chips, and definitely neither of them can be described as low power consumption.
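To make the trade-off in this exchange concrete, here's a back-of-the-envelope sketch. The only sourced figure is the ~50%-per-128b PHY size claim attributed to Dave Baumann above; the absolute area number is a made-up placeholder purely for illustration.

```python
# Rough memory-PHY die-area comparison. Assumes PHY area scales
# linearly with bus width; all absolute numbers are hypothetical.

TAHITI_AREA_PER_128B = 10.0  # hypothetical area units per 128-bit PHY slice
PITCAIRN_SCALE = 0.5         # Pitcairn slice ~50% of Tahiti's (quoted claim)

def phy_area(bus_width_bits, area_per_128b):
    """Total PHY area for a bus built from 128-bit slices."""
    return (bus_width_bits / 128) * area_per_128b

# Old-style 384-bit bus vs. a 512-bit bus built from denser slices.
tahiti_384 = phy_area(384, TAHITI_AREA_PER_128B)
dense_512 = phy_area(512, TAHITI_AREA_PER_128B * PITCAIRN_SCALE)

print(tahiti_384)  # 30.0
print(dense_512)   # 20.0
```

Under these assumptions, a wider 512-bit bus made of denser PHY slices could still take less die area than a narrower 384-bit bus made of older ones, which is the point being argued: bus width alone doesn't determine the cost.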
Supposedly Havok is working on OpenCL acceleration, which would allow it to function on both vendors' cards.
Should be interesting to see how GPGPU performance compares...
Do reviewers already have these cards?
How long after the big poopie event Wednesday can we expect to see some reviews?
How many months before they are available at a decent price? lol
Decent price appears to be right when they are released. If the performance is what the early leaks suggest, it's a much better deal than the Titan and the 780.
Better than a Titan??? Really? One or two FPS in any given benchmark, and less RAM.
I'd say it can run as fast frame per frame as the Titan, but nothing else.
Lower price, yes.
Is this card "better" than a 780 is what I think you'd need to address.
It looks to me like they may be a little bit better, and if the card has 4 GB of RAM, then it's a good buy.
It doesn't "destroy" the Titan like I've been hearing, but it certainly can match a 780, and it has more RAM if you believe these early leaked specs. But then it should, given the development cycle. The price IS better and that's nice.
I am talking about Nvidia's PhysX, not Havok. I know Havok can run on CPUs with no issues, but we all know Nvidia purposely doesn't want PhysX (their PhysX) to run on a CPU, so they can say it can only be run on Nvidia hardware.
Well, what Nvidia hardware is in the PS4?... My point is I don't think Nvidia would allow PhysX to run on the PS4, since it would prove that you don't need an Nvidia card to run PhysX (even though we already know it would).
Anyway
Does AMD make more money from a 300 dollar graphics card or a 600 dollar graphics card? (total unit profit, not per unit profit)
Depends on if they both come from the same die or not =P
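The question above is really about total profit being volume times per-unit margin. A toy sketch, with all volumes and margins invented for illustration (real figures aren't public):

```python
# Toy total-profit comparison for a cheap high-volume card vs. an
# expensive low-volume card. Every number here is hypothetical.

def total_profit(units_sold, margin_per_unit):
    """Total unit profit = volume x per-unit margin."""
    return units_sold * margin_per_unit

# Hypothetical: the $300 card sells 5x the volume at a smaller margin.
card_300 = total_profit(500_000, 40)   # $40 margin per card
card_600 = total_profit(100_000, 150)  # $150 margin per card

print(card_300)  # 20000000
print(card_600)  # 15000000
```

With these made-up numbers the cheaper card earns more in total, but flip the assumed volumes and the answer flips too; the per-unit margin alone tells you nothing.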
Gotta love all these bullshit predictions every time a new graphics card is released. The funny thing is nearly every one is wrong (especially the ones with 'Leaked Benchmarks' in them).
It will be about the same speed as the GTX 780 but cost $100-200 less, just like always, forcing Nvidia to lower the price of their 780.
GG SSDD.
How do you know this? Or are you just speculating?
As much as I don't want to downplay the speculated extraordinary performance of the R9-290X, there will be no card as legendary as the 9700 Pro. It's pretty much in its own tier. It completely wiped the floor. The innovation with that card was simply tremendous; it was the world's first chip to feature 8 pixel pipelines, as well as fully supporting the AGP 8x bus standard. Under normal conditions it beat Nvidia's then-current offering (the Ti4600) by anywhere from 40-100% when AF/AA was enabled. Doubling the performance of a competitor's card would be godlike in this day and age.