PS4 Will Out-Power Most PCs For Years To Come

Status
Not open for further replies.
Well, if you can successfully run Windows, x86 Linux, and x86 UNIX on the PS4, then I'll agree with you.
I'm not holding my breath for that though.

You shouldn't believe everything you read on the internet, as many consider smartphones utilizing x86 processors to be PCs, when they are no closer to being a PC than ARM-based smartphones are; though the term "PC" is pretty anachronistic in this day and age, and could *technically* be applied to almost any device of this nature.


I'm still waiting on your answer on why the Jaguar APU would be more powerful than other modern x86 processors, btw. ;)

I have just read the above message from the Internet. Here is another message for someone to read:-

Firstly, you need to understand that powerful is not best. Efficiency is. Having an architecture which allows a path of lesser resistance to be followed is superior. With it, you can get there quicker, while using less power. With computer architecture, it is about shortening the internal code and data pathways while achieving the same effect. It is the same as what they have been doing to the transistor logic within the GPU itself over the years. It is the same as how evolution works. The same principle applies to market forces & development tools. Everything in existence follows the path of least resistance available. It is down to the rules of physics. Every body moves in a straight line unless compelled by external forces to act otherwise. I say that a PC evolutionary branch-off to a dedicated “game engine” architecture (just like what happened with the GPU years ago, and the computer before that; AGP before PCI-E) is going to lead to finding that perfect design.

(Which “other modern x86 processors”?)
 
You don't have to be so ashamed, mine is only an Intel i3 530 @ 4.4GHz, plus 8GB of CAS 8 DDR3, plus an ATI 5870. Still, it is much better than yours.:D

Actually, about that:

Your system:
GPU: HD5870 - Score 2560
CPU: Core i3 530 @ 4.4GHz - Score 3857 (I adjusted the original stock-speed score with your frequency)

My system:
GPU: GTX480 - Score 4291
CPU: Phenom II X4 965 @ 3.4GHz - Score 4332


So aside from my dual-channel DDR2, my system pretty much kicks your system's ass.
Eat it! :D
 
Firstly, you need to understand that powerful is not best. Efficiency is. Having an architecture which allows a path of lesser resistance to be followed is superior. With it, you can get there quicker, while using less power. With computer architecture, it is about shortening the internal code and data pathways while achieving the same effect. It is the same as what they have been doing to the transistor logic within the GPU itself over the years. It is the same as how evolution works. The same principle applies to market forces & development tools. Everything in existence follows the path of least resistance available. It is down to the rules of physics. Every body moves in a straight line unless compelled by external forces to act otherwise. I say that a PC evolutionary branch-off to a dedicated “game engine” architecture (just like what happened with the GPU years ago, and the computer before that; AGP before PCI-E) is going to lead to finding that perfect design.

Taken straight out of a hardware/programming and engineering textbook, I love it!
While I do completely agree with this, if the CPU is lacking in processing power, no matter how efficient it is, it will be completely decimated by a high-end PC, as no such bottlenecks exist within one, unlike what you proclaimed earlier.

A weak-ass, highly-efficient CPU is not going to win the PS4 the performance crown between it and PCs, however.
 
Taken straight out of a hardware/programming and engineering textbook, I love it!
While I do completely agree with this, if the CPU is lacking in processing power, no matter how efficient it is, it will be completely decimated by a high-end PC, as no such bottlenecks exist within one, unlike what you proclaimed earlier.

A weak-ass, highly-efficient CPU is not going to win the PS4 the performance crown between it and PCs, however.

I wrote the book.

The PS4 CPU is fine. Having the equivalent of 4 x 3.2GHz independent cores, with GPU assist on physics, along with the high-bandwidth memory, is well balanced. More than can be said for most PCs.
 
Taken straight out of a hardware/programming and engineering textbook, I love it!
While I do completely agree with this, if the CPU is lacking in processing power, no matter how efficient it is, it will be completely decimated by a high-end PC, as no such bottlenecks exist within one, unlike what you proclaimed earlier.

A weak-ass, highly-efficient CPU is not going to win the PS4 the performance crown between it and PCs, however.

Let's focus on the CPU. What out there is better, and by how much?

Remember, the CPU here will be heavily involved in processing large streams of data, held within 8GB of high bandwidth memory. It doesn't need to run any faster than the data streams will allow.
 
Anyone who believes the PS4 will be faster than my computer at 1080p gaming, I have a deal for you.
1) PM me and I will give you my PayPal address, where you can send me $100 to $1000. If the PS4 is faster at 1080p games than my computer when it is released, I will PayPal you back TEN TIMES the money you sent me.
2) If you are not willing to do option #1, which you think will give you ten times your money, then you don't truly believe the PS4 will be faster than a PC right at its release, let alone 10 years later, so STFU.

Do you have any posts not in a text wall?
Read the damn original story then come post...

Maybe you will sound less like an idiot and be able to make a sound argument.
Maybe...
 
http://en.wikipedia.org/wiki/AMD_Accelerated_Processing_Unit

Jaguar
Jaguar will be the successor to Brazos 2.0.
The new core design will add support for SSE4.1, SSE4.2, AES, PCLMUL, AVX, BMI, F16C, and MOVBE instruction sets[56]
Memory address space is increased from 36 bits to 40 bits.
The floating point unit is considerably more powerful.
Core size of 3.1 square millimeters, down from 4.9 square millimeters.
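For what it's worth, the address-space and die-area figures in that list can be sanity-checked with a quick calculation (rough numbers only; the 4.9 mm² figure is the older Bobcat core):

```python
# Rough sanity check of the Jaguar figures quoted above.

# Physical address space: 36 bits -> 40 bits
old_space_gib = 2**36 // 2**30   # 64 GiB
new_space_gib = 2**40 // 2**30   # 1024 GiB (1 TiB)
print(f"36-bit address space: {old_space_gib} GiB")
print(f"40-bit address space: {new_space_gib} GiB")

# Core die area: 4.9 mm^2 (Bobcat) -> 3.1 mm^2 (Jaguar)
shrink_pct = (1 - 3.1 / 4.9) * 100
print(f"Core area reduction: {shrink_pct:.0f}%")   # roughly 37%
```

So the move from 36-bit to 40-bit addressing is a 16x jump in addressable memory, and the core shrinks by a bit over a third.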
 
The only way we can tell if the PS4 processor shows any promise is when Temash, Kabini, and Brazos 2.0 are released. All will be using a Jaguar compute unit in some form. It is known that Jaguar-based APUs will also be SoCs and will have a Radeon 8000M GPU on them of some kind.

Unfortunately, those ultrabooks, laptops and tablets using them won't be released until later this year.

Until then, we cannot gauge how fast Jaguar truly is. And, it'd be purely speculative to compare it to a current processor.

I think the one thing we can use is this: Brazos 2.0 is supposedly around 20% faster than Brazos 1.0 using the Jaguar cores. But Brazos is a mobile processor and most likely running at a lower frequency.
 
I wrote the book.

The PS4 CPU is fine. Having the equivalent of 4 x 3.2GHz independent cores, with GPU assist on physics, along with the high-bandwidth memory, is well balanced. More than can be said for most PCs.

Are you living in 2005???
All of your knowledge is around a decade old, and isn't necessarily wrong, just old and not really relevant.
Mentioning AGP, inefficient processors... it's like your knowledge is literally coming straight from the Pentium 4 NetBurst and Athlon XP era, none of which is relevant any more, at all.

Most PCs can easily do these things you mentioned, and I think you are really underestimating what they are actually capable of.
No, a PC with an i3 and HD4000 won't be capable of it, but throw in a GTX660 and it will easily be able to do everything that the PS4 can, and far, far more.


I would go further and argue that you have never owned a system that can run the next generation of “Game Engines” better.
I am going to throw these words right back at you, because if you are basing your experiences with modern games on that HD5870 of yours, boy have I got some news for you... that thing hasn't been relevant since the GTX200 series, which is very obsolete now.
Also, using your logic, you shouldn't even talk about PhysX, considering your GPU won't even natively allow for it, so until you get an NVIDIA GPU, just shush on that subject.


I also find it funny that my system kicks the shit out of your system (and I actually proved it), unlike what you incorrectly and arrogantly stated earlier, yet you fail to mention anything on that. :rolleyes:
You see, when I'm wrong, I man-up and admit it, as I did earlier in this thread; you should really consider doing the same, because not doing so isn't really helping your respect-O-meter right now. ;)
 
Taken straight out of a hardware/programming and engineering text book, I love it!
While I do completely agree with this, if the CPU is lacking in processing power, no matter how efficient it is, it will be completely decimated by a high-end PC as there are no existing bottlenecks that exist within it,

Whoa there, my Linux friend. Probably the biggest bottleneck that everyone carries around is Windows itself. Software makes a huge difference between perceived power and what's actually there.

Right now I would be considered as having a woefully underpowered "Bulldozer" machine. The thing is, in Linux, BD actually is pretty powerful. Yes, the problems with the design are there, but not having a shitty OS holding it back can do unbelievable things.

Consoles don't have nearly the amount of bloat that power rigs have. In addition, when you are talking about game development, you MUST (and it really isn't an option) account for the OS, along with the need for the software to work on a wide variety of configurations. Consoles have no such requirement. Regardless of how anyone feels, this is the reality. For developers, being able to focus on hardware that has more power than the min spec of their PC counterparts can mean all sorts of things that wouldn't necessarily be available to regular PC users.

The longevity of the PS3 and especially that of the XBOX360 kind of proves the point. Those consoles are easily 7 years old yet still can hang with and beat any regular machine you would buy from Best Buy.
 
What bottlenecks exist in modern high-end PCs?
WTF are you on about now???


That's your (base-less) opinion and it has yet to be proven.

The bottleneck is on the 1600MHz memory, which holds the gigabytes of game data, and on the PCI-Express bus.

Capiche?
 
Another way to look at this: the PS4 will be similar in size and power (CPU/GPU-wise and wattage-wise) to the 17" gaming laptops that will be out at the same time (around the holidays) from Alienware or CyberPower, using Intel quad cores and Nvidia graphics. That is a fair and valid comparison: the PS4 and a laptop both made around Christmas will have about the same computing/GPU power and wattage. Any good gaming desktop will be four times the size, use three times the wattage, and have an i7 and a $400-$500 graphics card. That is not a fair comparison; the desktop will OBLITERATE both the laptop and the PS4. Anyone who cares to disagree with this has not owned every game console in the last 20 years, as well as 6 laptops and 10 desktops.

Let me put this in plain English for you.
No one... I repeat, no one is saying that the PS4 will be the most powerful device on the market at launch or after launch.

But that is not what the quote said... The quote said MOST... Most PCs are not gaming desktops or laptops. Here are most...

http://www.shopping.hp.com/en_US/home-office/-/products/Desktops/Desktops
http://www.dell.com/us/p/inspiron-660/pd.aspx

And that is being optimistic ... because Most are not going to be bought at the same time...Most are ~ 2yrs old.

So your range is from Intel IGP to a 7550. Most computers are going to get raped.
Not mine.... not yours MOST.

Now... Let's have a reading comprehension test. What key word did the person being quoted use?
 
Are you living in 2005???

No, a PC with an i3 and HD4000 won't be capable of it, but throw in a GTX660 and it will easily be able to do everything that the PS4 can, and far, far more.


Agreed, but why throw in a GTX 660? I have a GTX 680 now, which will kill the PS4's GPU. And by Christmas (PS4 time) I will have a Titan or Nvidia's next high-end card. These cards will play Crysis, Metro, Tomb Raider, etc. at 1080p with ALL the eye candy, hi-res textures, AA, SSAO, etc. at 60fps or over. The PS4's GPU CAN'T and WON'T.
The PS4 will sacrifice AA, SSAO, and other image quality settings to obtain 30fps at true 1080p on demanding games, or 60fps on less demanding games. The whole argument is really about the GPU, isn't it? A GTX 680 will even perform well in 1080p games with a 2-5 year old processor. I am not worried about the PS4's processor or RAM, it will be fine, but there is no way in h@ll that the GPU will be up to the task of running ALL games at 1080p, 60fps, ultra settings. Period.
 
Let me put this in plain English for you.
No one... I repeat, no one is saying that the PS4 will be the most powerful device on the market at launch or after launch.

But that is not what the quote said... The quote said MOST... Most PCs are not gaming desktops or laptops. Here are most...

Why would I compare a GAMING CONSOLE (PS4) to the computers MOST non-gamer grandmas use? I compare the PS4 for gaming to a computer someone would buy or build for gaming. Computer wins. End of story. Yet I will still own the console for exclusives.
 
Let me put this in plain English for you.
No one... I repeat, no one is saying that the PS4 will be the most powerful device on the market at launch or after launch.

But that is not what the quote said... The quote said MOST... Most PCs are not gaming desktops or laptops. Here are most...

http://www.shopping.hp.com/en_US/home-office/-/products/Desktops/Desktops
http://www.dell.com/us/p/inspiron-660/pd.aspx

And that is being optimistic ... because Most are not going to be bought at the same time...Most are ~ 2yrs old.

So your range is from Intel IGP to a 7550. Most computers are going to get raped.
Not mine.... not yours MOST.

Now... Let's have a reading comprehension test. What key word did the person being quoted use?

I'd ignore hotsocks, he's clearly a troll or a PC fanboy with an epeen complex and poor reading comprehension skills.
 
When I re-read the comments, I really think he was focusing on the amount of RAM when he said it would out-power PCs for years to come. In that regard, he is right.

Whatever... RAM doesn't produce graphics.
 
The bottleneck is on the 1600MHz memory, which holds the gigabytes of game data, and on the PCI-Express bus.

Capiche?

Um, no.
DDR3 running in dual-channel (128-bit) @ 1600MHz has a memory bandwidth of 25.6GB/s.
PCI-E 2.0 with 16 lanes has a data transfer rate of 16GB/s total (8GB/s in each direction).
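Those two figures fall straight out of the bus parameters, for what it's worth (a back-of-the-envelope check; real-world numbers vary with timings and protocol overhead):

```python
# DDR3-1600 in dual channel: 1600 MT/s across a 128-bit (16-byte) bus
ddr3_gbs = 1600e6 * 16 / 1e9
print(f"DDR3-1600 dual-channel: {ddr3_gbs:.1f} GB/s")        # 25.6 GB/s

# PCIe 2.0 x16: 5 GT/s per lane, 8b/10b encoding = 4 Gbit/s usable per lane
pcie_gbs = 16 * 5e9 * (8 / 10) / 8 / 1e9
print(f"PCIe 2.0 x16: {pcie_gbs:.0f} GB/s each direction")   # 8 GB/s
```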

So unless your game itself is over 50-100GB loaded in system memory (which no game is when running), I would say you are completely full of shit.
Even when running BL2 at the settings I provided to you, it barely uses over 1-2GB of system memory, ever.

So what are these mystical "gigabytes of game data loaded into memory" that you speak of??? :rolleyes:

Dude, you have no fucking clue what you are talking about. Typical fucking programmer.
 
I'd ignore hotsocks, he's clearly a troll or a PC fanboy with an epeen complex and poor reading comprehension skills.

Yup... he definitely failed that check... it was simple, too... the word was in caps, for goodness' sake. I bet his grandma's basement is nice though.

Truly... despite it not breaking any world records... I really do want to see what the PS4 can do... not because I plan to buy one... I probably won't but because of Jaguar.
 
Another thing: if memory bandwidth were truly an issue with modern games, my system would be choking on its old-ass dual-channel DDR2, which has a memory bandwidth of 12.8GB/s.
Yet BL2 and many other titles can run at or damn near maximum settings at 1080p!

ZOMG, say it isn't so! :rolleyes:
So if my system can play these not-for-console games at near max settings with its old DDR2 memory, then why are you saying that the system memory will be the bottleneck again???

Oh that's right, because you don't have a fucking clue what you are talking about! :rolleyes:
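That dual-channel DDR2 figure is about right, assuming DDR2-800 (same arithmetic as for any DDR generation: transfer rate times bus width):

```python
# Dual-channel DDR2-800: 800 MT/s across a 128-bit (16-byte) bus
ddr2_gbs = 800e6 * 16 / 1e9
print(f"DDR2-800 dual-channel: {ddr2_gbs:.1f} GB/s")   # 12.8 GB/s
```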
 
I'd ignore hotsocks, he's clearly a troll or a PC fanboy with an epeen complex and poor reading comprehension skills.

What don't you comprehend? We are not comparing MOST computers (in general), we are comparing MOST computers (for gaming) to a console, the PS4 (for gaming). And MOST gaming PCs will be faster than the PS4. Most 2-year-old PCs in third world countries used for word processing and internet will not be faster than the PS4 for gaming, but they will surf the internet and word process at the same speed. I am not trolling, nor am I a PC fanboy, as I have built my PCs but also own a PS3 and an Xbox 360.
 
Let me put this in plain English for you.
No one... I repeat, no one is saying that the PS4 will be the most powerful device on the market at launch or after launch.

But that is not what the quote said... The quote said MOST... Most PCs are not gaming desktops or laptops. Here are most...

http://www.shopping.hp.com/en_US/home-office/-/products/Desktops/Desktops
http://www.dell.com/us/p/inspiron-660/pd.aspx

And that is being optimistic ... because Most are not going to be bought at the same time...Most are ~ 2yrs old.

So your range is from Intel IGP to a 7550. Most computers are going to get raped.
Not mine.... not yours MOST.

Now... Let's have a reading comprehension test. What key word did the person being quoted use?

That's pretty much what the article is.

Considering that the dedicated gaming PC market makes up only a very small subset of the whole desktop/laptop PC market, the PS4 is definitely not going to out-perform that small section of the market. However, when you consider that the majority of the computers bought and sold today are not gaming PCs, but are typically used for email, light gaming, social media, and productivity software, the PS4 will out-perform those PCs.

That's the point of the article.

And, until the majority of consumers start buying dedicated gaming PCs, the PS4 and the Xbox 720 will be considered pretty powerful compared to that Intel HD/weak mobility Radeon or Nvidia GPU-based PC/laptop one bought from Walmart, Best Buy, Office Depot or Staples for less than $500 or $600.

Your single-minded, everyday, Facebook/Twitter junkie consumer will think the PS4 and 720 will be the most powerful thing next to their business ultrabook, smartphone, tablet, and/or three-year old Intel GMA/HD-based computer. And, it's those that the PS4 will probably out-perform.

That's why the article says "most PCs" on the market today will not compare to the PS4 when it's out.

If you can run Crysis 3 at all medium settings (the closest I believe that'll come to the visuals of the Killzone PS4 demo on Jimmy Fallon's show) on an Intel HD 4000 at 1080p running at a nice 60 FPS, then by all means show me. Or, that Radeon Mobility 4250, or Nvidia 610M/620M/635M, or Nvidia GT 610/620/630/635, or Radeon HD 6550D, or Radeon HD 6250. (I think that covers some of the integrated and low-end dedicated GPUs on the market today.) If not, then the PS4 will out-perform most of the PCs and laptops currently sold and owned.

But, it will not out-perform that small percentage of dedicated gaming PCs and laptops you, myself or any other PC gamer own.
 
The bottleneck is on the 1600MHz memory, which holds the gigabytes of game data, and on the PCI-Express bus.

Capiche?

One other thing I don't think you understand or realize, considering you keep mentioning the PCI-E bus.
Current GPUs, including the GeForce Titan, do not even fully utilize PCI-E 2.0 x16, yet you continue to say there is a bottleneck there... yet there isn't!

Also, not everything can go through the memory, the CPU and GPU still need to communicate directly.
The CPU and GPU of the AMD APU share the exact same bus, and they (not the game data) will be communicating via that bus, so quit acting like memory bandwidth is everything!

STFU and GTFO, noob. :p
 
Fair enough, but by Christmas you will probably be able to purchase a PS4 for $500, or a decent gaming PC for about that. The only difference between the $500-$600 PCs you listed and a good gaming computer is the graphics card. By Christmas, I am sure you could get a graphics card more powerful than the PS4's for cheap, as a 2-year-old Nvidia card will beat it and cost next to nothing to add to your base computer with Intel 4000 graphics.
 
Are you living in 2005???
All of your knowledge is around a decade old, and isn't necessarily wrong, just old and not really relevant.
Mentioning AGP, inefficient processors... it's like your knowledge is literally coming straight from the Pentium 4 NetBurst and Athlon XP era, none of which is relevant any more, at all.

Most PCs can easily do these things you mentioned, and I think you are really underestimating what they are actually capable of.
No, a PC with an i3 and HD4000 won't be capable of it, but throw in a GTX660 and it will easily be able to do everything that the PS4 can, and far, far more.



I am going to throw these words right back at you, because if you are basing your experiences with modern games on that HD5870 of yours, boy have I got some news for you... that thing hasn't been relevant since the GTX200 series, which is very obsolete now.
Also, using your logic, you shouldn't even talk about PhysX, considering your GPU won't even natively allow for it, so until you get an NVIDIA GPU, just shush on that subject.


I also find it funny that my system kicks the shit out of your system (and I actually proved it), unlike what you incorrectly and arrogantly stated earlier, yet you fail to mention anything on that. :rolleyes:
You see, when I'm wrong, I man-up and admit it, as I did earlier in this thread; you should really consider doing the same, because not doing so isn't really helping your respect-O-meter right now. ;)

You can’t win your argument by distorting what I wrote.

Whether I have an i3 and HD4000 is not relevant to the success of the PS4. I am happy with the i3 and HD5870 because it runs Skyrim at maximum, with HD textures, at 1080p. The next card up worth getting won’t make much difference to such a DX9 game @ 60Hz (which actually partially proves my point).

You haven’t proved your system kicks any shit out of anything. Cherry-picking other people's benchmarks is not proof of your low-memory-bandwidth system running a particular game engine. According to many of the benchmarks that I have seen, such as with 3DMark Vantage, or games with more recent drivers, the HD5870 beats the GTX480 (significantly in many instances).
 
It's funny reading the comments on this post; most of us gamers don't realize "normal" people don't have gaming PCs. Everyone in my family, minus myself and a cousin who's also a gamer and builds custom rigs, has computers running hardware that was top of the line before 2005 or so. They just don't have a need to run top-of-the-line stuff for the, at most, flash games they play, so they don't upgrade. I myself upgrade every year or two now, just because I like to be able to play games at max settings, I need the storage for all of the junk I have, or I get tired of not being able to ramp up my performance on current hardware.
 
You haven’t proved your system kicks any shit out of anything. Cherry-picking other people's benchmarks is not proof of your low-memory-bandwidth system running a particular game engine. According to many of the benchmarks that I have seen, such as with 3DMark Vantage, or games with more recent drivers, the HD5870 beats the GTX480 (significantly in many instances).

http://www.hwcompare.com/8783/geforce-gtx-480-vs-radeon-hd-5870/
Well, there you have it, the GTX480 beats the hell out of your HD5870 in everything except power efficiency.
The only way your processor beats mine is in single-threaded apps.

I am happy with the i3 and HD5870 because it runs Skyrim at maximum, with HD textures, at 1080p. The next card up worth getting won’t make much difference to such a DX9 game @ 60Hz (which actually partially proves my point).
Taking offense much?
Well, I'm glad you are happy with it, but my system still whoops your system's ass! If you don't like it, then YOU shouldn't have fucking brought it up, DURRR! :rolleyes:


If my system truly were limited by my DDR2 memory, then I would not be getting 50-60fps in BL2 @ 1080p with the settings I shared with you earlier.
Damn you don't know anything about how this stuff works.
 
You forgot to subtract 3000 points from your score for having a junk OS. :p

LOL, well to be fair, I did have to install Win 7 since PlayOnLinux does not work with every single thing (BL2 for example).
It was a good run, but there are far too many licensed and proprietary pieces of software from Microsoft that require a license key and an actual working copy of Windows for many modern games to run properly.

For the games that did run though, they ran equally as well as they did natively in Windows, which is a milestone for WINE imo. ;)
 
Guys, this statement is really subjective. We're in [H], and MOST of us have PCs that will cream the PS4, but it's different for the general population.
 
DW-UK, you had better be looking at more than benchmarks to back what you are saying.
You have yet to give a reasonable argument or defense regarding the system memory and PCI-E bottlenecks which you continually claim exist in modern systems; they do not.

Give me some proof, not synthetic benches, that system memory and PCI-E bandwidth is a bottleneck.
Real-world applications and games would be a good start, NOT synthetic benchmarks.


I understand that memory bandwidth is important, especially so for HPC systems and VM farms, which can and do fully utilize the memory bandwidth of DDR3; this is one of the bottlenecks on my system when doing intensive CPU loads outside and inside of a VM simultaneously, whereas a similar system running DDR3 does not have that bottleneck.
This is for memory-intensive applications though, which games are not, at least not when it comes to system memory, as proven by my DDR2 and BL2.
 
http://www.hwcompare.com/8783/geforce-gtx-480-vs-radeon-hd-5870/
Well, there you have it, the GTX480 beats the hell out of your HD5870 in everything except power efficiency.
The only way your processor beats mine is in single-threaded apps.


Taking offense much?
Well, I'm glad you are happy with it, but my system still whoops your system's ass! If you don't like it, then YOU shouldn't have fucking brought it up, DURRR! :rolleyes:


If my system truly were limited by my DDR2 memory, then I would not be getting 50-60fps in BL2 @ 1080p with the settings I shared with you earlier.
Damn you don't know anything about how this stuff works.

It is easy to find a proper game benchmark where a HD5870 beats a GTX480.

GTX 480 Vs HD 5870 Crysis Showdown
http://www.bulletproofnerds.com/smf/index.php?topic=2119.0

Stick the game data in your DDR2 memory, along with an AMD CPU, then we might get something better.:p
 
I did have to install Win 7 since PlayOnLinux does not work with every single thing

It is unfortunate, especially with games that require an Internet connection and "cloud" capabilities.
For most stand-alone games though, it still works great.

It was fun beating STALKER: Shadow of Chernobyl on Linux. :D
 
It is easy to find a proper game benchmark where a HD5870 beats a GTX480.

GTX 480 Vs HD 5870 Crysis Showdown
http://www.bulletproofnerds.com/smf/index.php?topic=2119.0

Stick the game data in your DDR2 memory, along with an AMD CPU, then we might get something better.:p

Woo, one whole game.
It's funny how above, you mentioned that you didn't like me "cherry picking other people's benchmarks", yet here you have cherry picked another person's benchmark. :rolleyes:


You have yet to prove that system memory or the PCI-E bus is a bottleneck, so don't even fucking mention game data and DDR2 until I see some actual proof from you.
 
It is easy to find a proper game benchmark where a HD5870 beats a GTX480.

GTX 480 Vs HD 5870 Crysis Showdown
http://www.bulletproofnerds.com/smf/index.php?topic=2119.0

Stick the game data in your DDR2 memory, along with an AMD CPU, then we might get something better.:p

Another thing, the 5870 beat the 480 in average fps by a little over 1fps.
WOOPTY FUCKING DOOO!!!

Come on man, you aren't even trying. :rolleyes:
 
http://www.hwcompare.com/8783/geforce-gtx-480-vs-radeon-hd-5870/
Well, there you have it, the GTX480 beats the hell out of your HD5870 in everything except power efficiency.
The only way your processor beats mine is in single-threaded apps.

Damn you don't know anything about how this stuff works.

Those are not your benchmarks, and they were from before the HD5870 drivers were significantly improved. Plus, it says this at the end:-

“Please note that the above 'benchmarks' are all just theoretical - the results were calculated based on the card's specifications, and real-world performance may (and probably will) vary at least a bit.”
 
Another thing, the 5870 beat the 480 in average fps by a little over 1fps.
WOOPTY FUCKING DOOO!!!

Come on man, you aren't even trying. :rolleyes:

The Crysis benchmark is more important than any 177 FPS DX9 game.

Losing by 1 FPS is not "kicking the shit out of".
 
Woo, one whole game.
It's funny how above, you mentioned that you didn't like me "cherry picking other people's benchmarks", yet here you have cherry picked another person's benchmark. :rolleyes:


You have yet to prove that system memory or the PCI-E bus is a bottleneck, so don't even fucking mention game data and DDR2 until I see some actual proof from you.

I was proving that I could pick bigger cherries.
 
Those are not your benchmarks, and they were before the HD5870 drivers were significantly improved. Plus it says this at the end:-

“Please note that the above 'benchmarks' are all just theoretical - the results were calculated based on the card's specifications, and real-world performance may (and probably will) vary at least a bit.”

OMG, those first four were game benchmarks taken from a third party, aka Tom's Hardware, and were not synthetic benchmarks, making them real, not theoretical. :rolleyes:

Dude, whatever, you're going to believe what you want to believe, and I didn't want to start a shit-fling fest over which of our GPUs is better than the other (btw, you started that).
I've asked you again and again for proof of the system memory and PCI-E bottlenecks that you claim exist in modern systems, and you have backed up your statement with nothing.

Therefore, this proves you don't know what you are talking about (yet again), and that this thread has derailed far enough; also, I don't need any proof from you, because I know your statement is bullshit and not a real-world issue, period.
I would keep going with the derailing, but I only do that to Microsoft and/or Windows 8 threads. :D
 