#1
AMD is not allowed to see the DX11 code for Watch Dogs
__________________
StinkyMojo ~ http://www.heatware.com/eval.php?id=32642 Intel i7 2600 @ 3.9GHz * MSI 290X Gaming * 8GB G.SKILL Sniper 1.25v Lite-On 256GB SSD * Samsung 1.5TB HD * SeaSonic M12II 520w * NZXT Vulcan LG 29UM65 29" UltraWide * Das Model S Ultimate (MX blues) * Logitech g400s
#2
So just like what AMD did with Mantle and BF4. Payback is a beyotch!
#3
Way to compare apples to cows.
__________________
0.o InorganicMatter-"A broken clock is right twice a day."
#4
I understand that the Mantle problem is much worse, but I feel it's still a fair comparison.
#5
Meh... par for the course in the nVidia vs AMD pissing contest. Whoever makes the backroom deals and writes the checks first gets the advantage. Nothing new, sadly.
__________________
* i7-3770K * Asus Z77 Maximus V Gene * Prolimatech Megahalems Black + PK-3 + push/pull Enermax Vegas Trio 120mm PWM * 2x8GB Crucial Ballistix Sport DDR3-1600 1.35V * 256GB Crucial M4 + 480GB Seagate 600 + 5.25" to 3.5" Hot Swap Bay * SLI Asus GTX 780 DirectCU II OC * Asus VG248QE * Corsair 350D Windowed + Arctic Cooling 140mm PWM + Arctic Cooling 120mm PWM * Seasonic X-750 * Windows 7 Ultimate x64 SP1 Start Menu Edition
#6
Nvidia using their proprietary, closed GameWorks libraries is nothing new, ESPECIALLY on bundled games (...Batman...etc...).
This is what Nvidia does. They still need to be baseline compatible with DX, but they hide "their" library code, which they supplied to the developer, so they can dictate EXACTLY (and I mean EXACTLY) what the competition's performance will be. Glad to see an article about it, but it has been this way for years...
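To make that concrete, here's a rough sketch of how closed middleware like this typically plugs into a game. All of the names below are made up for illustration; they are not actual GameWorks symbols.

```cpp
// vendor_fx.h -- hypothetical public header shipped with a closed SDK.
// The developer receives this header plus a precompiled .lib/.dll;
// everything behind it is a black box, so neither the developer nor a
// rival GPU vendor can inspect or tune what the library actually does.
struct VendorFxContext;  // opaque handle: the layout is never exposed

VendorFxContext* VendorFxCreate(void* d3d11Device);
void VendorFxRenderHair(VendorFxContext* ctx, int strandCount);
void VendorFxDestroy(VendorFxContext* ctx);

// Game-side code: the game issues its own Direct3D 11 draw calls for
// most of the frame, then hands entire effects to the sealed library.
void RenderFrame(VendorFxContext* fx)
{
    // ...game's own D3D11 rendering here...
    VendorFxRenderHair(fx, 20000);  // what happens inside is invisible
}
```

The result still runs on any DX11 card, which is the "baseline compatible" part; what's hidden is how expensive that one call is on hardware the library's author doesn't make.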
__________________
Desktop #1: i7 920 @ 4.0ghz + Corsair H50 :: 12 GB Corsair Vengeance RAM :: 2x Asus DC2T R9 280X Crossfire :: Logitech G13 + G19 + G700 :: 1x Corsair Agility 60GB SSD, 1x WD Black 1.5TB, 2x Seagate 1.5TB (RAID 1) :: 1300w PSU Desktop #2 : Xeon E5620 @ 3.8ghz + Arctic Freezer 7 PRO :: 6 GB RAM :: 1 x 7950 Boost :: 1x Kingston 64GB SSD :: 1x WD Black 1TB Laptop : HP Pavilion dv6-6135DX AMD A8-3500M Crossfire w./ 6750M (now aka 7750M) :: 6GB RAM :: 240GB OCZ Agility 3 SSD :: Blu-Ray :: undervolted from 0.700v @ 600mhz to 1.25v @ 2.8ghz Laptop #2 : ASUS K55N AMD A-Series A8-4500M(1.90GHz) w./ AMD Radeon HD 7640G :: 8GB Memory :: 250GB Samsung 840 EVO SSD
#7
AMD’s Mantle, a low-level API, doesn’t require the company’s GCN architecture to function properly. AMD says it will work equally well on Nvidia cards. The company clearly waves a banner of open-source development and ideals.
AMD isn't going as far as Nvidia.
#8
Zero performance gain from a Radeon 280X to a 290X is just unreal, though.
__________________
StinkyMojo ~ http://www.heatware.com/eval.php?id=32642 Intel i7 2600 @ 3.9GHz * MSI 290X Gaming * 8GB G.SKILL Sniper 1.25v Lite-On 256GB SSD * Samsung 1.5TB HD * SeaSonic M12II 520w * NZXT Vulcan LG 29UM65 29" UltraWide * Das Model S Ultimate (MX blues) * Logitech g400s
#9
Quote:
Not gonna happen.
#10
Quote:
__________________
|| CPU: Core i7 3770K\4.5GHZ@1.240V Cooled by Corsair H100i || MB: ASUS Sabertooth Z77 || GPUs: Gigabyte R9 280X WF OC Rev2. Core:1150mhz@1.163v Mem:1650mhz(Actually in Use) EVGA GTX 780 SC ACX 1280mhz/7000mhz (working machine) || Monitor: BenQ XL2420T 120hz || RAM: 4x4GB Corsair Vengeance CL9 1600mhz || HDD: 1TB Western Digital Caviar Blue+ SSD: Samsung 840 PRO 250GB. || Case: Corsair Vengeance C70 "gunmetal black" edition || PSU: Corsair HX-850 || Logitech G510 with Aida64 LCD Custom Monitoring || Corsair M90
#11
Quote:
Not AMD's fault. AMD could offer the source code, specs, and library, and even PAY for Nvidia to integrate it, and Nvidia would still say 'no'. See the 'FreeSync' fiasco.
#12
Quote:
I think you would have more success telling a wall that it's a door than getting anywhere with Prime1.
__________________
PC i7 3770K @ 4.5ghz 1.25v Asus Gene V G.Skill 2x4gb DDR3 1600 Gigabyte 7950 Windforce 3X and MSI Twin Frozr III 7950 Silverstone Gold Evolution 750w G.Skill Phoenix 60gb SSD | 2 x Plextor M5P 256gb in Raid 0 Samsung F1 Spinpoint 500gb Creative Fatal1ty Titanium Sound Card Samsung 23" LED and LG 23" LCD
#13
What does Mantle have to do with any of this?
Unless DICE specifically denied optimization access to Nvidia during BF4's development, Mantle is a totally separate issue. GameWorks isn't an API.
__________________
4670k @ stock w/ Lapped TRUE 120 ASUS Maximus VI Hero GSkill Ripjaws X 2x4GB @ stock DDR3-1866, 8-9-9-24, 1.5v Sapphire 280X Vapor-X Tri-X @ stock 1100/1500, 1.2v Cooler Master V700 NZXT H440 Dell U2412M 1920x1200 60hz
#14
Quote:
Mantle is an API. DirectX is an API. OpenGL is an API. GameWorks is not an API. What Nvidia is doing is taking an API (DirectX) and wrapping it in their own closed "GameWorks" layer. I know why they're doing it, but I really do see a problem with it. All it would take is AMD doing the same thing, and we'd have to change out GPUs to play different games. You want to play game X, you need an Nvidia card. You want to play game Y, you need an AMD card. GameWorks is bad for gamers; as a gamer, if you don't see that, I hope you like changing out GPUs all the time.
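For anyone fuzzy on the distinction being drawn here: an API like Direct3D 11 is a vendor-neutral contract that AMD, Nvidia, and Intel drivers all implement, which is why the same game code runs on any of their cards. A minimal sketch (this is plain D3D11, nothing vendor-specific):

```cpp
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

// Create a D3D11 device on whatever GPU is installed. The runtime
// routes the call to the installed driver; the application neither
// knows nor cares which vendor made the hardware.
bool CreateDeviceOnDefaultAdapter()
{
    ID3D11Device*        device  = nullptr;
    ID3D11DeviceContext* context = nullptr;
    HRESULT hr = D3D11CreateDevice(
        nullptr,                   // default adapter
        D3D_DRIVER_TYPE_HARDWARE,  // use the real GPU's driver
        nullptr, 0,                // no software rasterizer, no flags
        nullptr, 0,                // default feature levels
        D3D11_SDK_VERSION,
        &device, nullptr, &context);
    if (FAILED(hr)) return false;
    context->Release();
    device->Release();
    return true;
}
```

GameWorks sits a layer above this: it consumes the API rather than defining one, which is exactly how its internals can stay closed while the game underneath remains "DX11 compatible".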
__________________
Gamer-PC Intel Core i7 4770K @ 4.2ghz, Corsair H100i MSI Z87-G45 Gskill Sniper 8GB 2133mhz MSI AMD R9 290, NZXT G10, Corsair H55 Samsung 840 EVO 250GB raid 0 Corsair HX650 Corsair Air 540 Windows 8.1 Pro with Media Center Dell 29 inch UltraWide U2913WM
#15
I am fairly sure he is not testing on a WD-optimized driver, which makes a lot of this very likely moot. Not the first time we have seen code held back from team Green or team Red. We might just be seeing some of this very, very soon.
Quote:
__________________
Intel Core i7-4770K @ 4.6GHz - 1.31vCore ASUS ROG Maximus VI Extreme 16GB of Corsair Dominator DDR3 @ 1600MHz 9-9-9-24-1T NVIDIA GTX 980 SLI Antec KUHLER H2O 920 CPU Cooler 3 x Dell U2410 displays in Eyefinity- 3600x1920 4 x Corsair Force GS 480GB SATA 3 6Gb/s SSDs 3 x Western Digital 2TB Spinning Drives Silverstone Raven 3 Chassis Silverstone Strider Gold 1200W PSU
#16
Do people really make their GPU purchase decisions based on one game? Sounds ridiculous.
__________________
MSI Z77A-G45 Gaming - Intel 2500K @ 4.3Ghz Xigmatek S1283 - 16GB G.Skill Sniper 1600Mhz MSI GTX970 Gaming SLI (1585mhz/8ghz) Modded BIOS LIAN LI PC-A70F Corsair AX850 PSU 3x Dell S2740L Surround (5760 x 1080) - 48" Sony Bravia Samsung 840 Pro - W8.1 - Samsung EVO 250GB - 1TB WD for Storage Razer Anansi / Logitech G602 / Ratpadz XT / G27 Wheel Creative GigaWorks T40 Series II / Logitech G930 Headphones Lenovo Y510p, Core i7, 8GB RAM, 250GB Samsung EVO 250GB SSD, GTX750M SLI
#17
If Nvidia has their way, you might have to.
__________________
Gamer-PC Intel Core i7 4770K @ 4.2ghz, Corsair H100i MSI Z87-G45 Gskill Sniper 8GB 2133mhz MSI AMD R9 290, NZXT G10, Corsair H55 Samsung 840 EVO 250GB raid 0 Corsair HX650 Corsair Air 540 Windows 8.1 Pro with Media Center Dell 29 inch UltraWide U2913WM
#18
edit: In light of new information, my rant seems a bit puerile. End of post. :/
__________________
case: CM Storm Stryker | mobo: MSI Z87-GD65 Gaming | cpu: i7-4770k @ 4.6 GHz (1.225v) - Corsair H100i | gpu: 2x AMD R9 290 CFX on Kryographics blocks, Aqualis res, NexXxos Monsta 360mm rad | ram: G.SKILL Ripjaws X Series 16GB 1866 | storage: Crucial M4 256GB SSD, Samsung 830 256GB SSD | psu: Corsair AX1200 | os: Win 8.1 | *** | PC displays: 27" Overlord Tempest X270OC Glossy 1440p@ 110Hz, 50" Sony KDL50W800B | sound: X-Fi Titanium HD>Objective2>AKG Q701 | kb: custom Ducky Shine II TKL (reds/o-rings) | mouse: SteelSeries Sensei
Last edited by NukeDukem; 05-27-2014 at 04:49 AM.
#19
Exactly my point. This is why Mantle is worse. Watch Dogs is DX11, so it will work on different systems. If it were Mantle, it would be limited to AMD only.
Either way, it looks like this was much ado about nothing. At least I got to use my Breaking Bad gif.
#20
Quote:
Stupid proprietary DX11.
__________________
4670k @ stock w/ Lapped TRUE 120 ASUS Maximus VI Hero GSkill Ripjaws X 2x4GB @ stock DDR3-1866, 8-9-9-24, 1.5v Sapphire 280X Vapor-X Tri-X @ stock 1100/1500, 1.2v Cooler Master V700 NZXT H440 Dell U2412M 1920x1200 60hz