Why Blizzard is a threat to ATI/Nvidia's high-end segment and high-end PC gaming

There are still many exclusives on the PC that actually push the hardware, in many cases from sources you wouldn't expect. FPS just progressed and evolved faster on the PC, so original takes don't abound as much... It's hard to make a shooter that pushes the boundaries of tech and also has innovative gameplay. OTOH, that's exactly what you've got going in other areas of PC gaming.

A lot of RTS games are innovating on gameplay while being more demanding than any past games of the same genre. Games like Supreme Commander and CoH will really benefit from a nice rig... Not to mention the fact that I can't imagine playing them on a console even if they could technically handle it. So a lot of this comes down to personal preference... But it'll be a long time before ATI/NVidia find that they don't have much of a market because games simply don't push the hardware.
 
This thread should be renamed "Why high end hardware elitists are a threat to the future of PC gaming".

A software company can't cater to 2% of an audience and expect to stay in business. Especially when games cost $50+ million to produce.
 
In the late 90s and early 2000s we had all kinds of exclusive cutting edge action games for the PC.
Such as...? Make a list, rather than simply saying "we had all kinds of exclusive cutting edge action games", because you aren't really saying anything there.
 
I imagine he's alluding to the Quake/UT franchises, and later HL/CS, even though most of those ended up being ported to consoles in one way or another eventually... It's not like that is a one-way street either. I don't see what all the fuss is over. I know this isn't a console vs PCs thread, but to even insinuate that ATI/NV are in trouble because games will universally become less and less demanding is kinda dumb.

Even if you ignore PC gaming and assume it's dying, last I checked, they still release a new, more powerful console every few years or so... In fact, some of these consoles have had shorter life spans than some of the old classics; the NES/SNES probably had the longest lifespans of any consoles ever. There's always gonna be a demand for prettier, shinier, bigger, and more realistic. :)

If anything, what devs should learn to do is hedge their bets and not release games w/performance options that you simply can't use yet. Crysis actually played decently on a lot of hardware, just not at max detail; if they had just added the Very High settings in a later patch, there probably would've been less complaining. /shrug

It's really not the first game to be built w/some future scaling in mind though; most of the old sims used to be like this before development in the genre slowed down...
 
I think there may be some truth to the OP's ideas. I for one haven't spent any money on anything PC related for several months because I don't have a need for it, the reason being that I play WoW and watch anime on my PC. I know that I am just one person, but I am sure there are many others like me who haven't spent any money on PC hardware because of WoW and games like it. If that is indeed the case, then YES, WoW is hurting the hardware market. With that being said, the real question people should be asking is HOW MUCH it is hurting the hardware market. I think it's not very much at all, but any lost sale can be considered "hurting" the hardware market.
 
I don't think there's any question WoW has slowed sales of other games, I think all MMORPGs (UO, EQ, etc.) have done this to an extent, and certainly as a whole... But this isn't a new trend.

I think that's a different phenomenon altogether from what the OP is trying to claim; games aren't necessarily becoming less demanding across the board because people don't wanna pony up for the hardware, or because it takes away from resources devoted to gameplay during development. We've even started to see MMORPGs that are more demanding on hardware than ever before, see Age of Conan.

In fact, you could even look at it from an entirely different point altogether... GPUs have advanced so far in so little time that devs are having trouble catching up and using them to their full potential in many cases (Crysis aside), so adoption of the hardware lags while development of the software catches up. Doesn't mean either ceases or is set back by a monumental margin though.
 
I imagine he's alluding to the Quake/UT franchises, and later HL/CS, even though most of those ended up being ported to consoles in one way or another eventually...
Well, I did a little homework for him.

1998
Unreal
Half-Life
Powerslide

1999
Quake 3
Unreal Tournament
Freespace 2

2000
Giants: Citizen Kabuto
American McGee's Alice

2007
Enemy Territory: Quake Wars
Crysis
STALKER

Sticking only to graphically-intensive action games, it looks like 1999 was a pretty damn good year for hardcore PC enthusiasts. 1998 wasn't too shabby either (Powerslide and Unreal were nuts), though 2000 was a little depressing. Many a high-end video card was sold in 1999, no doubt. But look also at 2007, where we have Quake Wars, Crysis (duh) and STALKER. So, are we really slowing down here...?
 
Freespace 2 was sweet, where have all the sci-fi flight-sims gone! Giants was pretty amusing too, heh; I think that got ported to the PS2 eventually as well. If you go beyond the shooters and action games the list expands, particularly in later years as those games catch up to the FPS and become more graphically intensive. The Witcher, SupCom, etc.
 
Late 1990's through the early 2000's.

Processors:

January 1997 - Pentium MMX 166 MHz
May 1997 - Pentium 2 233 MHz
April 1998 - Pentium 2 400 MHz
March 2000 - Athlon 1 GHz
January 2002 - Athlon XP 2000+
September 2003 - Athlon 64

Graphics Cards:

February 1998 - Voodoo 2 ($230 cheap end, $300 high end)
...
October 2003 - Radeon 9800 XT ($500)

Let's not even get into prices in terms of regular computers. From PCWorld:

http://www.pcworld.com/article/3923/400mhz_pentium_iis_the_great_leap_forward.html

Here's another plus: Prices are lower than you think. PII-400 systems (with 64MB of SDRAM, AGP graphics cards with 4MB of SGRAM, and 17-inch monitors) start at $2750 for a Hewlett-Packard Vectra VL Series 8, a corporate model, and $2769 for Gateway's GP6-400, a small-business or home-office system.
...
Those PII-400 prices seem even more reasonable when you consider chip introductions of the recent past. When we reviewed the first PII-300 systems in October 1997, the three we tested cost $4080, $3799, and $3499.

Now, let's look within the past few years.

July 2006 - Core 2 Extreme X6800 (2933 MHz)
February 2008 - Core 2 Extreme QX9775 (3200 MHz)

November 2006 - Geforce 8800 GTX
June 2008 - Geforce 280

So, in the past year and a half, games have gained at best an increase of 20%-80% depending upon the game, whereas in the late 1990's and early 2000's, you were getting a 100% minimum increase every year.

It's no surprise that the PC took off in those years, because you ended up in a constant cycle of upgrades, and games could take advantage of them. And at the same time, it should be no surprise that companies had to cater towards the low-end, I-just-bought-this-ex-top-of-the-line-computer-6-months-ago person to prevent pissing off their customers.

Technology has rapidly slowed down.
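To put rough numbers on that contrast, here's a quick back-of-the-envelope Python sketch. The 50%-over-18-months figure is just an illustrative midpoint of the 20%-80% range above, not a benchmark:

```python
# Back-of-the-envelope: effective annual performance growth.
# The input numbers are illustrative assumptions, not measurements.

def annual_growth(total_gain: float, years: float) -> float:
    """Convert a total performance gain over `years` into an
    equivalent compound annual growth rate (as a fraction)."""
    return (1 + total_gain) ** (1 / years) - 1

# Late 1990s: roughly 100% gain per year.
old_rate = annual_growth(1.00, 1.0)

# 2006-2008: say a 50% total gain over 1.5 years.
new_rate = annual_growth(0.50, 1.5)

print(f"late-90s: {old_rate:.0%}/yr, 2006-08: {new_rate:.0%}/yr")
# → late-90s: 100%/yr, 2006-08: 31%/yr
```

Even taking the optimistic end of the 20%-80% range, the compounded annual rate comes out well under half of what it was a decade ago.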

But to say that people will be happy with Blizzard or Valve? If they include DX 10.1 and it looks remarkably better, but slows the game down to a crawl, no. People will be pissed if they can't run a game at 200+ fps with every unreasonable option on.

But graphics aren't resolution. World of Warcraft and Warcraft 3 are horrible looking games quantitatively, but look great because of the quality of the art, and the art style of the game.
 
Honestly I would LOVE it if Blizzard would come out with a version 2 of the WoW engine. I would buy all new hardware so that I could run it. BUT they would need to have it work with the old servers at the same time, allowing the people that chose not to "upgrade" to this version 2 engine to play with those that did upgrade. I don't know that this is even possible (that would be like having CS 1.6 people playing with CSS people on the same server at the same time). I think that if Blizzard COULD do this with WoW, then you would see a huge jump in hardware buying for a month or two.

I LOVE WoW and have played it almost exclusively, but I hate the graphics and the fact that I don't "need" a high-end system to play it LOL Yea, I know it doesn't make any sense to me either :p

I do like your post Nytegard, it has some interesting facts in there, but I couldn't understand what you were trying to say. I'll read it again :)
 
I'm sure it's entirely possible for Blizzard to have two different WoW engines, it'd be complicated, but it's entirely possible. Other MMORPGs have run into that situation as they age and they decide to do a graphic overhaul with an expansion, ending up w/a severely upgraded graphics engine that has to play nice not only with the old client but with all the existing landscape/architecture within the game world (more so than the server infrastructure).

Look at some DAOC release screenshots and then some screenies from the latest expansions for instance, in many cases it doesn't even look like the same game, it all works transparently though. Sometimes they go back and update the textures/artworks for old zones independently of the engine/client/expansion you're running, etc.

I think Blizzard's gotten away without the need for such hassle by just focusing on the art direction of the game though. They've got no incentive to do something so drastic 'till the game starts to look really dated though, or 'till they have some real competition. Besides the fact that high-end graphics have never been a Blizzard specialty.
 
With the announcement of Diablo 3, the future is becoming clearer now. Let's face it, in 2 years the majority of PC gaming business will be going to Blizzard. What do WoW, Starcraft 2, and Diablo 3 have in common? They are/will be addictive as hell and have low system requirements.

Sure, Blizzard might support DX 10.1 in D3 and SC2, some bells and whistles, but these games are still not going to push video cards very far, and they will still look and run great on low-end hardware. If you can play WoW, you'll be able to play any other modern Blizzard game.

With the majority of PC gamers using Blizzard products, there will be even less of a chance for companies like Crytek to turn an (acceptable) profit with cutting-edge graphics games. Not only is Blizzard a threat to ATI/Nvidia's high-end business, they are a threat to PC gaming as a platform for new, cutting-edge graphics technology.

There are games like The Sims and Spore that will also take business. But again, these are games that have low system requirements, so there's no incentive for PC gamers to get a high-end video card. This is the direction PC gaming is going, and ATI/Nvidia will inevitably be affected in the coming years.

This is just my current opinion and prediction, feel free to counter, but please be civil.

Once you've got 80 Tempests with hella interceptors fighting hundreds of Zerg and other Terran units (multiplayer) with all DX10 effects at 1920x1200, you'll really want to upgrade to a 4870/260 if you're still using an older card.

And I wouldn't say Diablo 3 can run easily on an 8800GTS just yet; the game's barely done. It only seems easy to render because they don't keep the enemy corpses long enough. In D2 they last for minutes; I was disappointed to see they fade away in less than 15 secs in D3. It's always fun to see loads of dead enemies on the screen at once.
 
If my 8800GTS G92 can run Crysis at med-high, it can run D3 maxed at 1440x900.

Once you've got 80 Tempests with hella interceptors fighting hundreds of Zerg and other Terran units (multiplayer) with all DX10 effects at 1920x1200, you'll really want to upgrade to a 4870/260 if you're still using an older card.

And I wouldn't say Diablo 3 can run easily on an 8800GTS just yet; the game's barely done. It only seems easy to render because they don't keep the enemy corpses long enough. In D2 they last for minutes; I was disappointed to see they fade away in less than 15 secs in D3. It's always fun to see loads of dead enemies on the screen at once.
 
Game is nowhere near complete, that's pure conjecture, for all you know it won't be out for another 2 years during which time it could go through any number of graphical upgrades. Anyone remember the first screenshots of StarCraft 1? :D
 
Honestly I would LOVE it if Blizzard would come out with a version 2 of the WoW engine. I would buy all new hardware so that I could run it. BUT they would need to have it work with the old servers at the same time, allowing the people that chose not to "upgrade" to this version 2 engine to play with those that did upgrade. I don't know that this is even possible (that would be like having CS 1.6 people playing with CSS people on the same server at the same time). I think that if Blizzard COULD do this with WoW, then you would see a huge jump in hardware buying for a month or two.

I LOVE WoW and have played it almost exclusively, but I hate the graphics and the fact that I don't "need" a high-end system to play it LOL Yea, I know it doesn't make any sense to me either :p

I do like your post Nytegard, it has some interesting facts in there, but I couldn't understand what you were trying to say. I'll read it again :)

Eh, just a history lesson.

What defines high end? What defines low end?

Let's take Half-Life, which came out in Nov. 1998.

The minimum requirement is a Pentium 100 MHz, which was released in March of 1994, about 4 1/2 years prior. For an equivalent time span today, we're looking at about an Athlon 64 3400+.

Crysis's minimum requirements:

Intel Pentium 4 2.8 GHz (3.2 GHz for Vista), Intel Core 2.0 GHz (2.2 GHz for Vista), AMD Athlon 2800+ (3200+ for Vista) or better

NVIDIA GeForce 6800 GT, ATI Radeon 9800 Pro (Radeon X800 Pro for Vista) or better, which actually exceeds the time span that Half-Life had.

And if you take on additions such as Blue Shift, Counter-Strike, or Opposing Force, you're talking June 1995 for the technology needed, or, compared to today, an Athlon 64 FX-60 dual-core system.

Also, the timespan that I wrote in the original was about the same amount of time that elapsed between the average life span of a console (which is roughly 5-6 years).

To get to the point, Crysis still has the same target as Half-Life, but the problem is that it also caters towards new hardware at the same time. If, for some reason, Diablo 3 had the ability to make top-end PCs cry with uber graphics, it wouldn't matter that people could still play Diablo 3, just not the way they want to.
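For the curious, the min-spec age comparison can be sketched in a few lines of Python. The release dates are approximate; the Pentium 4 2.8 GHz date in particular is my assumption (Northwood, August 2002), so treat the numbers as ballpark:

```python
# Sketch: how old was each game's minimum-spec CPU when the game shipped?
# Dates are approximate, taken from public release histories.
from datetime import date

def spec_age_years(game_release: date, min_cpu_release: date) -> float:
    """Years between the minimum-spec part's launch and the game's launch."""
    return (game_release - min_cpu_release).days / 365.25

# Half-Life (Nov 1998) vs. its min-spec Pentium 100 MHz (Mar 1994)
hl = spec_age_years(date(1998, 11, 19), date(1994, 3, 7))

# Crysis (Nov 2007) vs. its min-spec Pentium 4 2.8 GHz (Aug 2002, assumed)
crysis = spec_age_years(date(2007, 11, 13), date(2002, 8, 26))

print(f"Half-Life min CPU: ~{hl:.1f} yrs old; Crysis min CPU: ~{crysis:.1f} yrs old")
# → Half-Life min CPU: ~4.7 yrs old; Crysis min CPU: ~5.2 yrs old
```

Both games targeted hardware roughly five years old at launch, which is the point: the minimum-spec window hasn't really shrunk.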

Few people like purchasing a new computer every year. If you look at a computer purchase in the same way you look at a console purchase, one that lasts 5 years, it's no surprise that games released in the late 1990's and early 2000's had to cater towards "low end" computers, because technology was increasing at an incredibly rapid pace. So rapid, in fact, that low end back then would actually be closer to midrange if you time-shifted it to today.

The difference being that people felt the need to upgrade a decade ago, because technology was increasing at such a rapid pace. Windows booted faster. We went from the software-based Quake 1 engine, which couldn't hit 20 FPS, to a fully 3D Quake 3 running at insane FPS. Video had to be incredibly compressed.

But today, computers from half a decade ago can do everything we need that isn't games. And in terms of graphics upgrades, we've increased a couple of pixel shader levels, which really isn't that obvious, because it can still be emulated just by making a few extra calls on older hardware.
 
damonposey,

Your original argument was that the proliferation of popular games with low-to-mid level system requirements will adversely affect ATI/Nvidia's high end business. I assume this is because you think such games will mean fewer games that push the graphic envelope, hence fewer people will buy high end cards.

First of all, the real money in graphics hardware isn't in high end. Not even close. The real money is in low end--integrated/bundled graphics. High end is there primarily for bragging rights, or company image if you will. How many people, even on a dedicated PC enthusiast site like [H], buy cutting edge graphics cards? A very small percentage compared to the entire graphics hardware pie. So even if there were a decrease in high end buyers, it wouldn't cause more than a ripple in A's/N's bottom line.

Then there's the issue of whether high-end buying would even drop off in the first place. I highly doubt it. As long as there's any, any, game that pushes the envelope, the hard-core gamers or e-peen-ers will buy cutting edge graphics cards. The type of people who buy high-end cards won't be swayed by the popularity of lower-spec games in terms of whether or not they buy top-of-the-line stuff.

What about console gaming, you say? Consoles have allegedly been the death of PC gaming for many years now, and look what's come of that. PC gaming is still very viable, and if studios can figure out a way to discourage piracy (an attractive downloadable content and online play model such as mentioned by Blizzard for D3 is a very good idea IMO) there's no reason why they can't continue to make money from PC gaming. And there are just some games that would never work well on a console--Medieval Total War, RTS games, etc.--and some that are better on a PC--RPGs with an active mod community like Oblivion.

And finally, we don't even know the system requirements on SC2 or D3. If Titan Quest running on my system is any indication, an 8800GT would be ok but a higher-end card would be better. I assume the same will be true of SC2 and D3--games that should be even more graphically intensive. So even there we have question marks.

My suggestion: let's just wait and see, shall we? Again, the sky isn't falling, and I'm pretty sure it won't be anytime soon.
 
I think piracy is just a BS excuse for putting out a game that sucked in SP and MP. BF2, 2142, HL2-engine online games and others are still doing fine, and from personal experience BF2 is a great multiplayer game. At least the way these companies curb piracy is that they make it almost impossible to play online with a pirated copy. Actually, DICE/EA does not allow you to play BF2 online at all if you have new folders/files in the BF2 folder which it does not recognize. Steam updates are pretty much the same thing, and you might as well buy a copy of CS: Source or TF2 if you want to play legitimately online at all.
 
Most people dont even own such a small widescreen.

Whoa there. Most gamers/enthusiasts on this forum, you mean? 19" widescreens are a very popular selection for new PCs, especially in new Dells. I can't count how many Dell machines I've set up for clients in the past few months that have come with 19" widescreens as standard.
 
Yea, 'cause a Dell 22" or 24" costs how much? I'm pretty sure a Dell 24" runs around $500.
 
I haven't seen a truly original concept game in ages, the space shooter sim, flight combat sim, classical RPG and some types of strategy games we saw in the '90s are all but dead today.

I miss X-Wing Vs. Tie Fighter. If there is a game even remotely similar to that style of play, I would buy it.
 
Well, the Freespace series (1&2) mentioned in this thread was probably the last great 'space-sim' game... Can't remember if there were any good ones after that or shortly before it... Though I do remember a revamp or re-release of XvsT w/semi-updated graphics (Freespace was still prettier from what I recall though); either way, I imagine they'll all look dated today. Those games came out 5-6 years ago. Oh, and that quote isn't really true...

Serious sims aren't dead (flying, racing, sub, or what have you), they just don't have the massive retail/ad presence they used to have when they were one of the biggest PC genres. A lot of serious sims are now simply sold online to curb costs, the audience for them doesn't really need to be awed by advertisements and PR anyway. Just because they're not retail products doesn't mean a lot of 'em aren't pretty high-end hardware-pushing games either.

You should see some of the multi-display rigs people on these boards have for that kinda game. MS' Flight Sim X was a notorious hardware hog when it was released. Even turn-based strategy games aren't really dead... Sins of a Solar Empire anyone? Civ Colonization? I've seen a bunch lately... Heck I saw a TV ad for Civ coming out on Xbox/PS, go figure.
 
Those of you wishing for the good ol' days of gaming might want to take notice of this:

Good Old Games

These people (members of CD Projekt) are taking old games (like Fallout 2, Freespace 2), updating them to run on Windows XP and Vista, and selling them online for $5.99 and $9.99 as downloads. Talk about games that don't need current hardware to run :)
 
With the announcement of Diablo 3, the future is becoming clearer now. Let's face it, in 2 years the majority of PC gaming business will be going to Blizzard. What do WoW, Starcraft 2, and Diablo 3 have in common? They are/will be addictive as hell and have low system requirements.

Sure, Blizzard might support DX 10.1 in D3 and SC2, some bells and whistles, but these games are still not going to push video cards very far, and they will still look and run great on low-end hardware. If you can play WoW, you'll be able to play any other modern Blizzard game.

With the majority of PC gamers using Blizzard products, there will be even less of a chance for companies like Crytek to turn an (acceptable) profit with cutting-edge graphics games. Not only is Blizzard a threat to ATI/Nvidia's high-end business, they are a threat to PC gaming as a platform for new, cutting-edge graphics technology.

There are games like The Sims and Spore that will also take business. But again, these are games that have low system requirements, so there's no incentive for PC gamers to get a high-end video card. This is the direction PC gaming is going, and ATI/Nvidia will inevitably be affected in the coming years.

This is just my current opinion and prediction, feel free to counter, but please be civil.

I haven't bothered to read all the pages in this post since I am at work, but I definitely do not agree with your reasoning. There are a lot of people out there that don't even play Blizzard games, so it's a moot point in my book. I did not much like SC, was a C&C fan. Did not play Diablo for any length of time; just played through the story and got bored just pointing and clicking and collecting more stuff. I admit, Blizzard makes good stories, and the CGI sequences need to be made into some movie done by them. Won't play WoW, so not much left.
I am still playing on my December '06 build with my 7800GTX card and am completely happy with all of the games that I play, with no slowdowns. Mostly an FPS/sandbox fan myself. Been wanting to upgrade, but not just because of games, because I like to upgrade and design new computers.
 
Been wanting to upgrade, but not just because of games, because I like to upgrade and design new computers.

See, that's pretty much how I feel. I like to upgrade too, because it's enjoyable, and I have the money. It's just, after that initial coolness factor wears off and I have things running stably, hanging out less in the BIOS, there need to be plenty of games for me to flex the computing muscle, or else all these great parts will be collecting dust. My most anticipated FPS game of this year is Left 4 Dead, coming out in November, and the x1900xt that I bought years ago will be able to run it well. If it doesn't run it satisfactorily I'll probably upgrade, but still, it's the Source engine, it's scalable.
 