[techradar] AMD on the PS4: We gave it the hardware Nvidia couldn't

I called it a few days ago. So glad Sony went with AMD. Nvidia is butthurt; that's why they released that stupidly transparent PR. :D
 
Well, he just stated the facts. Nvidia does not have the hardware Sony wanted. This also explains why AMD has been working so hard on the all-in-one solution.

Congratulations to AMD for a job well done in this case.
 
How does that Nvidia Kool-Aid taste? Bitter? :cool:

This made me grin like
[Yao Ming meme image]
 
You're clueless. Sony didn't go cheap. They will have, by far, the most powerful next-gen console.

You don't seem very informed; the other consoles are AMD too...

The first consideration before anything else is price. If you look at the absolutely crap deal AMD got for the Xbox 360 (we can assume this one is probably similar), it's unsurprising that other console makers would want that, and that other GPU makers wouldn't want to "compete" with it... it's hard to undercut razor-thin margins... :p
 
Nintendo has been with ATI/AMD since the GameCube, so please don't assume they went AMD just because of the 360. If anything, it's Sony and MS following Nintendo's example, as always.
 
How do you know that, since we haven't seen what Microsoft will be offering?

Because Durango development kits are in the hands of hundreds of developers and there have been leaks. Digital Foundry and Edge magazine have confirmed through multiple independent sources that Durango has a 12 compute unit GPU at about 1.2 TF vs the 18 CU GPU at about 1.8 TF in the PS4. Not to mention the PS4 has 8GB of GDDR5 at 176 GB/s vs 8GB of DDR3 in Durango. They both have 8 Jaguar cores. The PS4 GPU is 50% more powerful and the system has about twice the bandwidth.
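
For anyone who wants to double-check that math, here is a minimal sketch of where the numbers come from. The ~800 MHz GPU clock, the 256-bit buses, and the DDR3-2133 data rate are assumptions taken from the same leaks, not confirmed specs:

```python
# Back-of-the-envelope math for the leaked PS4 vs. Durango specs.
# Clock speeds and bus widths below are assumptions from the rumors,
# not official figures.

def gcn_tflops(compute_units, clock_ghz=0.8):
    """Peak single-precision TFLOPS for a GCN GPU:
    64 shaders per CU, 2 FLOPs per shader per clock (fused multiply-add)."""
    return compute_units * 64 * 2 * clock_ghz / 1000.0

def bandwidth_gbs(bus_width_bits, data_rate_gtps):
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return bus_width_bits / 8 * data_rate_gtps

ps4_tf = gcn_tflops(18)                 # ~1.84 TF
durango_tf = gcn_tflops(12)             # ~1.23 TF
ps4_bw = bandwidth_gbs(256, 5.5)        # GDDR5 at 5.5 GT/s -> 176 GB/s
durango_bw = bandwidth_gbs(256, 2.133)  # DDR3-2133 -> ~68 GB/s (ignores the rumored ESRAM)

print(f"PS4:     {ps4_tf:.2f} TF, {ps4_bw:.0f} GB/s")
print(f"Durango: {durango_tf:.2f} TF, {durango_bw:.0f} GB/s")
print(f"GPU ratio {ps4_tf / durango_tf:.2f}x, main-memory bandwidth ratio {ps4_bw / durango_bw:.2f}x")
```

The 18 vs 12 CU gap works out to exactly 1.5x at the same clock; the main-memory gap is actually bigger than 2x, which the rumored ESRAM in Durango is supposed to narrow.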
 
Did the PS3 so much good.

The PS3's advantage was only in CPU flops, and due to its arcane nature that was difficult to utilize. It's well known the X360 had the better GPU: apples-to-apples GPU flops very slightly favored the X360, and it was also a much more efficient architecture (unified shaders).

This time around both systems are using essentially the same AMD GPU tech, except Sony aimed for a higher specification: 18 CUs vs 12 CUs.
 
Which is why all the consoles have Blu-ray cartridge drives. :p

Are you slow or just trolling?

The Wii was the first console to introduce slot loading (not "cartridge drives" :rolleyes:), and a proprietary Blu-ray-based optical disc is being used in the Wii U because of the costs involved in creating an entirely new storage format. Moreover, the storage limitations of the other media currently available are too severe and would inhibit ports from the other consoles already using Blu-ray (HD DVD not being an option, as no one produces it anymore). The discs used in the Wii U are not Blu-rays but a proprietary optical disc based on them, supplied by Panasonic. FFS, I could've sworn this was [H] and not GameFAQs circa 1995 all over again.
 
Are you slow or just trolling?

Are you just ignorant of history?

Nintendo stuck with cartridges longer than anyone else; surely if everyone were following Nintendo like they did with motion controls (hint: this is sarcasm), then all consoles would still be on cartridges?
 
If you want to discuss the N64, which introduced:

Rumble Pak (which everyone else introduced afterwards)
Analog sticks (which everyone else introduced afterwards)
4-player support built into the console
3D beyond what any non-cartridge system of that era was capable of
No load times, as you may have forgotten how slow optical media was back then

Then there are other threads for that. You're grasping at straws with your selective memory, buddy.
 
If you want to discuss the N64, which introduced...

。    ()   。      ..
↑     ↑    ↑       ↑ ↑              
(the point) (Jupiter) (Uranus)   (pluto) (you)
 
。    ()   。      ..
↑     ↑    ↑       ↑ ↑              
(the point) (Jupiter) (Uranus)   (pluto) (you)

lol I'll admit I laughed.
 
Nintendo was also the first to use motion controls (Power Glove, anyone?), and that was way before some of you dudes were even born...
 
The cheap option?

Sounds like that may be accurate. If the rumors are true that it's an APU, as in an integrated CPU and GPU on one die, then that is correct. Having only one chip makes things cost less. Each chip raises the hardware price. However, it also means that the overall performance can't be as high. You could not pack, say, a GTX 680 and an i7-2600 on one die. It would be too large, too hot, too prone to failure.

Hence AMD APUs are the cheap option. They cost less and deliver less performance. That's all fine and well, but if people are taking what they said to mean "We delivered a performance level nVidia couldn't," that is almost certainly false. We can see in discrete GPUs that nVidia has no problems performance-wise.

My guess is it was the APU approach, combined with a willingness to take extremely low margins, that got AMD the contract. The APU made it cheaper, and AMD probably took a very low-margin deal since they needed the contract, and that was a price nVidia couldn't touch.
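
To make the cost argument concrete, here is a minimal sketch of the usual die-cost arithmetic. Every figure in it (wafer price, die sizes, defect density, packaging and board overhead) is an invented placeholder, not an actual AMD or Sony number; the point is only the shape of the trade-off, where much of the single-chip saving comes from one package, one chip's worth of board space, and one memory interface rather than from the silicon itself:

```python
import math

# Toy model of chip cost. All constants are made-up placeholders chosen only
# to illustrate why one integrated APU die vs. two discrete chips changes the
# bill of materials -- none of them are real AMD/Sony figures.

WAFER_COST = 5000.0       # $ per 300 mm wafer (assumed)
WAFER_DIAMETER = 300.0    # mm
DEFECT_DENSITY = 0.2      # defects per cm^2 (assumed)
PACKAGING = 15.0          # $ packaging/test per chip (assumed)
DISCRETE_OVERHEAD = 20.0  # $ extra board space, VRMs, second memory interface (assumed)

def dies_per_wafer(die_area_mm2):
    """Classic gross-die estimate for a circular wafer."""
    radius = WAFER_DIAMETER / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * WAFER_DIAMETER / math.sqrt(2 * die_area_mm2))

def yield_rate(die_area_mm2):
    """Simple Poisson yield model: bigger dies catch more defects."""
    return math.exp(-DEFECT_DENSITY * die_area_mm2 / 100.0)

def cost_per_good_die(die_area_mm2):
    return WAFER_COST / (dies_per_wafer(die_area_mm2) * yield_rate(die_area_mm2))

# One ~350 mm^2 APU vs. a ~150 mm^2 CPU plus a ~250 mm^2 GPU (all assumed sizes).
apu_cost = cost_per_good_die(350) + PACKAGING
two_chip_cost = (cost_per_good_die(150) + cost_per_good_die(250)
                 + 2 * PACKAGING + DISCRETE_OVERHEAD)

print(f"single APU:      ~${apu_cost:.0f}")
print(f"CPU + GPU chips: ~${two_chip_cost:.0f}")
```

With these placeholder numbers the single APU comes out ahead, but note the big die actually yields worse than two small ones, so the real saving is as much about the package, board, and shared memory pool as it is about the wafer.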
 
I never understood Nvidia fanboys... more money for the same outcome as AMD.... maxin out gamez
 
I never understood Nvidia fanboys... more money for the same outcome as AMD.... maxin out gamez

I wouldn't consider myself an NVIDIA fanboy, but my history is mostly with Nvidia cards. I decided to give the 5870 a try a couple of upgrades ago because it was a deal I couldn't pass up. Anyways, when the card ran great, it really was great. But there were issues that came up that I never ever had with an NVIDIA card. Brown screens with lines, grey screens with lines, troubles with dual monitors and flickers. These were things that never happened before I gave ATI a shot.

My last upgrade, I decided to go back to NVIDIA because it was also a deal I couldn't pass up. This 660 Ti is running flawlessly: no flickers, no strange brown/grey screens, no driver crashes. Also, driver installation and use is just so much easier with Nvidia. Overclocks like a mofo, too.

Some may say that my 5870 was faulty, but I did plenty of research online to try to figure out wtf was going on, and it seemed to be a common issue and complaint with ATI cards.

So, call me a fanboy if you want, but NVIDIA just seems more stable, reliable, and user friendly from my video card history.

Anyone with the newer ATI cards having any of these issues by any chance?
 
I wouldn't consider myself an NVIDIA fanboy, but my history is mostly with Nvidia cards. I decided to give the 5870 a try a couple of upgrades ago because it was a deal I couldn't pass up. Anyways, when the card ran great, it really was great. But there were issues that came up that I never ever had with an NVIDIA card. Brown screens with lines, grey screens with lines, troubles with dual monitors and flickers. These were things that never happened before I gave ATI a shot.

My last upgrade, I decided to go back to NVIDIA because it was also a deal I couldn't pass up. This 660 Ti is running flawlessly: no flickers, no strange brown/grey screens, no driver crashes. Also, driver installation and use is just so much easier with Nvidia. Overclocks like a mofo, too.

Some may say that my 5870 was faulty, but I did plenty of research online to try to figure out wtf was going on, and it seemed to be a common issue and complaint with ATI cards.

So, call me a fanboy if you want, but NVIDIA just seems more stable, reliable, and user friendly from my video card history.

Anyone with the newer ATI cards having any of these issues by any chance?


Well, I have used both and never had the issues described with AMD cards. Nor have I had issues with Nvidia; both max out games, and AMD is cheaper. So if I had to choose between them, I would choose AMD for sure.
 
Sounds like that may be accurate. If the rumors are true that it's an APU, as in an integrated CPU and GPU on one die, then that is correct. Having only one chip makes things cost less. Each chip raises the hardware price. However, it also means that the overall performance can't be as high. You could not pack, say, a GTX 680 and an i7-2600 on one die. It would be too large, too hot, too prone to failure.

Hence AMD APUs are the cheap option. They cost less and deliver less performance. That's all fine and well, but if people are taking what they said to mean "We delivered a performance level nVidia couldn't," that is almost certainly false. We can see in discrete GPUs that nVidia has no problems performance-wise.

My guess is it was the APU approach, combined with a willingness to take extremely low margins, that got AMD the contract. The APU made it cheaper, and AMD probably took a very low-margin deal since they needed the contract, and that was a price nVidia couldn't touch.

Considering the only thing Nvidia has that would compete with AMD for a console's all-in-one chip is Tegra, AMD does in fact have a faster product for less money.
 
Maybe Nvidia tried offering their Tegra chip to Sony... and Sony said they were not looking to revive their Walkman hardware, but to make a next-gen console...
 
。    ()   。      ..
↑     ↑    ↑       ↑ ↑              
(the point) (Jupiter) (Uranus)   (pluto) (you)

This may be the funniest post I've ever read. Well done! :D
 
I never understood Nvidia fanboys... more money for the same outcome as AMD.... maxin out gamez

Shit, I never understood fanboys in general.

We're talking about consoles here: manufacturing cost, the initial loss per system, and the amount of time/sales it takes to begin turning a profit are weighed as heavily as (if not more heavily than) the end product's ability to deliver their vision. These are big-ass corporations with dimly lit dungeons full of bean counters doing their very best with math skills just to earn another meal. They undoubtedly went with the cheaper option, but the point is that it doesn't exclude the possibility that this thread title is correct.
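
To put rough numbers on the bean counting (all of them invented for illustration, nothing leaked or official), the break-even point for a subsidized console is just the per-unit hardware loss divided by what each unit earns back in royalties and services per year:

```python
# Toy break-even model for a loss-leader console. Every figure is an invented
# placeholder to show the shape of the math, not a real Sony/Microsoft number.

hardware_loss_per_unit = 60.0    # $ lost on each console sold at launch (assumed)
royalty_per_game = 8.0           # $ platform fee per game sold (assumed)
games_per_console_per_year = 4   # attach rate (assumed)
services_margin_per_year = 12.0  # subscriptions, DLC cut, etc. (assumed)

yearly_return = royalty_per_game * games_per_console_per_year + services_margin_per_year
years_to_break_even = hardware_loss_per_unit / yearly_return

print(f"Each console earns back ~${yearly_return:.0f}/year in software and services")
print(f"A ${hardware_loss_per_unit:.0f} subsidy breaks even in ~{years_to_break_even:.1f} years")
```

Swap in different placeholder values and you can see why a slightly cheaper chip, multiplied by tens of millions of units, matters more to these companies than a few extra compute units.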

It is an inflammatory comment that gets brand-loyal retards barking at each other, sure, but we all know that both companies have things the other does better in some way.
 
Each chip raises the hardware price. However, it also means that the overall performance can't be as high. You could not pack, say, a GTX 680 and an i7-2600 on one die. It would be too large, too hot, too prone to failure.

Hence AMD APUs are the cheap option. They cost less and deliver less performance.

You couldn't put a discrete GTX 680 and an i7 in there either. Waaaay too expensive, hot, and power-hungry for a mass-market-priced console. The Crytek CEO, of all people, said the next-gen PS4 has about as much graphics horsepower as can realistically go into a console.

If Sony had gone with an Nvidia GPU, it would have been close in performance to what AMD provided, and they would still have needed to find a CPU. Nvidia calling Sony "cheap" is just a loaded word, not a reflection of reality. I mean, if an 18 CU, 1.8 TF GPU is going cheap, what would they call Durango's puny 1.2 TF effort? An Xbox-U?
 
And who cares? When it comes out, it will still be years behind PC hardware. Consoles = lose
 