Gaben confirms Source 2 engine

While I definitely agree with you, with consoles being programmed to the metal, you can probably get $1k-PC performance from ~$750 in parts. And while the console itself still won't contain anywhere close to $750 in parts, you'll see outsized gains compared to the equivalent PC.

Indeed. I just don't want anyone thinking they will be getting the equivalent of a $1k gaming PC for $300. The end result would disappoint a lot of people.
 
Absolutely. Even with the combination of bulk/warehouse pricing for the hardware and the significant programming efficiency gains, you won't get current $1k performance for $300-$500. But if you're looking at 1080p, you should be able to get some very nice 30 fps single-player performance.
 
I suppose that depends on how one defines "adequate". I don't think next-generation console ports (think UE4-powered games) are going to run adequately on today's $1k gaming PC at 2560x1600/2560x1440. It's only adequate today because of today's games, which are mostly undemanding.

Remember that console performance is often a function of developers being able to write very close to the metal. On the PC, we're often too far from the metal to squeeze out good performance.

Yes, it is more efficient from a development standpoint. We aren't talking about 100% gains, however.

I suppose I should clarify: if WhiteZero is talking about $1k off-the-shelf gaming PCs, then I would fully agree with him.
 
I suppose I should clarify: if WhiteZero is talking about $1k off-the-shelf gaming PCs, then I would fully agree with him.

$1k off the shelf might turn out to be more realistic in the end. Entirely possible.
But I was more thinking an average $1k custom build: i5-2500k, GTX 660 Ti type stuff.
 
I think you guys are looking too much at the numbers. Let's just face this:

When the Xbox 360 first came out and I loaded up my copy of Perfect Dark Zero, the visuals blew EVERYTHING ELSE out of the water. In 2005, most PC gamers were using 1600x1200 CRT monitors, dual-core CPUs cost thousands of dollars, and one gigabyte of DDR2 RAM was over $150. GPUs had just enough RAM to run maybe 4x MSAA. The best-looking PC game in 2005 was a tie between Doom 3 and Half-Life 2, maybe Far Cry for some people. Things like normal mapping, UV-distortion effects like parallax mapping, and post-process effects like HDR lighting were rare. Some games had one, maybe two of these effects; most had none. Unreal Tournament 3 and Crysis weren't even publicly announced; in fact, Unreal Engine 3 was just a tech demo... for the Xbox 360.

This brings us back to the launch of the Xbox 360: not only could the Xbox perform all of the above effects, it could do so with hundreds of characters on-screen, running at 30 FPS on a 1080i screen (which less than 5% of the Western population owned at the time). It wasn't long before PC games like Crysis came out and totally redefined what was possible for real-time rendering, and now 1080p LED screens are less than $150 at the local shops. But for a couple of years, PC gamers were looking with envy at what a little $500 box could do. Sure, it's easy for us to look and say "hya hya! your silly console can't even run 1080p or 60 fps, look at how bad your downloadable version of Crysis looks! hya hya!" when really, Crysis was considered the gaming-rig killer all the way up until 2010, and for the most part people had to turn the settings down to medium/high to even enjoy it on a $1k-2k machine.

So don't just ASSUME that the next-gen consoles are going to blow JUST because the nearly 8-year-old ones do now.
 
Ricochet 2.

Gaben.jpg

valve_logo_svg.png
 
Consoles are always going to have better graphics legs, because they allow devs to push the boundaries and actually code at the performance edge, rather than having to code for a million combinations of hardware. This has admittedly become a bit blurry with games having to be targeted at all consoles and PC simultaneously, so you get one 'master target' and then various substandard ports in many cases.

But console makers know that saving, say, $5 on memory (which seems like such a stupid thing to do in hindsight) means tens of millions of dollars, and has very little if any effect on game sales, since devs are always going to find a way to maximize console hardware. This doesn't happen on the PC; PC gamers are expected to upgrade to the latest and greatest.
 
I think you guys are looking too much at the numbers. Let's just face this:

When the Xbox 360 first came out and I loaded up my copy of Perfect Dark Zero, the visuals blew EVERYTHING ELSE out of the water. In 2005, most PC gamers were using 1600x1200 CRT monitors, dual-core CPUs cost thousands of dollars, and one gigabyte of DDR2 RAM was over $150. GPUs had just enough RAM to run maybe 4x MSAA. The best-looking PC game in 2005 was a tie between Doom 3 and Half-Life 2, maybe Far Cry for some people. Things like normal mapping, UV-distortion effects like parallax mapping, and post-process effects like HDR lighting were rare. Some games had one, maybe two of these effects; most had none. Unreal Tournament 3 and Crysis weren't even publicly announced; in fact, Unreal Engine 3 was just a tech demo... for the Xbox 360.

This brings us back to the launch of the Xbox 360: not only could the Xbox perform all of the above effects, it could do so with hundreds of characters on-screen, running at 30 FPS on a 1080i screen (which less than 5% of the Western population owned at the time). It wasn't long before PC games like Crysis came out and totally redefined what was possible for real-time rendering, and now 1080p LED screens are less than $150 at the local shops. But for a couple of years, PC gamers were looking with envy at what a little $500 box could do. Sure, it's easy for us to look and say "hya hya! your silly console can't even run 1080p or 60 fps, look at how bad your downloadable version of Crysis looks! hya hya!" when really, Crysis was considered the gaming-rig killer all the way up until 2010, and for the most part people had to turn the settings down to medium/high to even enjoy it on a $1k-2k machine.

So don't just ASSUME that the next-gen consoles are going to blow JUST because the nearly 8-year-old ones do now.

they will blow, just like they do now, and always will. you fully contradict this point just by realising how far pc gaming has come, in spite of being held back for years and years like this. the concept of "re-defining real time rendering" on a console is laughable when you consider that the pc gaming market does this every time they update api standards, which is something that can't be done on unified hardware. can not.

envy devices with a singular purpose, that become obsolete in their entirety, by the time you could swap out minor components on any open platform? you must be joking. you're falling into the same trap naive fanboys stumble into thread after thread, trying to compare $1k multipurpose and infinitely more economical rigs to $500 worth of oem lock-in hardware.

this model is extinct, and there exists no such economies of scale or engineering that can bring you these features for free, how much money do people need to spend in order to realise this. you will always be making huge trade offs. always, there is no way around it. boggles the mind how even at this point, gamers who claim to understand how the market works have still not learned their lesson.
 
they will blow, just like they do now, and always will. you fully contradict this point just by realising how far pc gaming has come, in spite of being held back for years and years like this. the concept of "re-defining real time rendering" on a console is laughable when you consider that the pc gaming market does this every time they update api standards, which is something that can't be done on unified hardware. can not.

envy devices with a singular purpose, that become obsolete in their entirety, by the time you could swap out minor components on any open platform? you must be joking. you're falling into the same trap naive fanboys stumble into thread after thread, trying to compare $1k multipurpose and infinitely more economical rigs to $500 worth of oem lock-in hardware.

this model is extinct, and there exists no such economies of scale or engineering that can bring you these features for free, how much money do people need to spend in order to realise this. you will always be making huge trade offs. always, there is no way around it. boggles the mind how even at this point, gamers who claim to understand how the market works have still not learned their lesson.

The hell are you rambling on about?
 
they will blow, just like they do now, and always will. you fully contradict this point just by realising how far pc gaming has come, in spite of being held back for years and years like this. the concept of "re-defining real time rendering" on a console is laughable when you consider that the pc gaming market does this every time they update api standards, which is something that can't be done on unified hardware. can not.

envy devices with a singular purpose, that become obsolete in their entirety, by the time you could swap out minor components on any open platform? you must be joking. you're falling into the same trap naive fanboys stumble into thread after thread, trying to compare $1k multipurpose and infinitely more economical rigs to $500 worth of oem lock-in hardware.

this model is extinct, and there exists no such economies of scale or engineering that can bring you these features for free, how much money do people need to spend in order to realise this. you will always be making huge trade offs. always, there is no way around it. boggles the mind how even at this point, gamers who claim to understand how the market works have still not learned their lesson.

tumblr_mcokutf1In1rk48ieo2_400.jpg
 
Consoles are always going to have better graphics legs, because they allow devs to push the boundaries and actually code at the performance edge, rather than having to code for a million combinations of hardware.

You're not, that's why it's abstracted... and why if you're working on a giant AAA title, you probably literally have NV / AMD engineers coming out to work on shit for their hardware. :p

Overhead on the PC side is getting smaller and smaller as drivers and APIs evolve. DX10 was an enormous step in the right direction, alongside changes to the driver model, etc. Some people in this thread make it sound like there's an utterly insurmountable overhead or something that is always there, looming, and tanking framerates.

The end-all is: PC is another platform; if you make a shit port, it's going to... be shit. This even happens with 360-to-PS3 ports... there are moments where COD on the PS3 is like 25 fps behind the 360 version (I think MW3 finally brought them closer to performance parity). Or just try playing some Valve games on console. I played L4D on console once and it often chugged, especially in multiplayer... meanwhile I also played it maxed out on like an X1900 and an AMD X2, perfectly smooth. Source is obviously rooted in the PC and has had the most work done on the PC side.

You can surely get away with a bit more on consoles, but the gap is narrowing and likely will continue to. W8 is a faster OS than its previous incarnation, and DX11.1 will bring potential goodies as well. PC scalability is even more ridiculous now than it was when the current-gen consoles launched. I expect them to be quite fast for the price point in comparison, and probably more than a nice match against a solid single-card machine... but it ends there.

It's pretty interesting that the next-gen consoles might have x86 chips too, and if they're all running some kind of DX11 subset... well, ports might be a lot simpler in the future...
 
I have to admit...

Half Life 1 was the main reason I bought my first PC. It was a damn good reason too.

Half Life 2 was neat. Steam was better than HL2. Need I say more?

Half Life 3, all gods be damned, I don't even want to think about the possible letdown incoming. Valve has been goin down the Valve for a while imho. Steam was better than HL2. Need I say more? OK, BF3 kicks the shit out of whatever Gabe has going with Source 2 imho. Ricochet 2? stfu Gabe, no one likes your lameass jokes.
 
Most people forget just how graphically intensive Source/HL2 was back then; it was the best-looking game of its time (IMO). I remember buying the ATI 9800 Pro just to play it; even though it was not the best card on the market, it served me well.

BTW, isn't Valve coming out with their own console?
 
Team Fortress: Play
Team Fortress 2: "Free"-2-Play
Team Fortress 3: Pay-You-2-Play
:eek:

 
The hell are you rambling on about?
That the argument is mostly pointless to anyone not interested in blowing steam up each other's asses. At least, that's my take.

The news itself is probably not surprising to anyone who doesn't really give a shit about Valve... it's obvious their current engine of choice is out of date, and they are probably looking into something for the next generation. The point is, they're known for not talking about such things until they feel confident that they'll release something. So, in the end, it's just a "we're not dead, remember us!" kinda thing... and yeah, there's a good chance they did that just to grab more money, but I don't see the harm in hoping they are actually close to releasing a game. People need hope (not to become obsessed with it, though), and a positive attitude does go a long way towards leading a happy life. Oh look, baby rabbits and deer are outside... cute... brb
 
CryEngine 3 and Frostbite 2 are examples of engines that are designed to run on consoles but still pack additional features that can be utilized on PC. I'm sure Epic could have done the same had they bothered to utilize their own engine for the PC. Point is, it's not impossible.

So on the technical side, I'm not concerned about how capable Source 2 will be just because it's developed to be cross-platform.

How HL3 will turn out is a completely different discussion altogether.
 
now you're going to pretend you don't speak english all of a sudden, when someone calls out a post you feel like you're supposed to agree with.

No, I'm saying your post was a rambling incoherent mess.

Fact is, in the real world consoles have a bigger market than PC gaming. Does that mean PC is dead or dying? Of course not. But any developer who wants to make money is going to put a lot of focus on consoles.

From what I gather your post was a diatribe about the superiority of PC over console. And technically you are correct. But consumers love consoles. So they'll continue to be the majority platform for gaming.
 
Some people in this thread make it sound like there's an utterly insurmountable overhead or something that is always there, looming, and tanking framerates.
There's still a substantial amount of overhead, and it isn't in any way insignificant. I'm hopeful that the impending next-generation consoles are going to force graphics vendors to unify on a memory model and on instructions such that developers can simply write code on top of GPUs. It's something many developers have expressed considerable interest in already. If there were no significant benefit to it, no one would have expressed any interest in it.

Graphics APIs are absolutely what you want for 97% of games out there, but the 3% of games that could benefit from being written much closer to metal are likely going to have the highest budgets and are going to be competing at the highest level. It'd be a great thing for the PC gaming industry if developers who want to shed layers of abstraction and close the distance between code and silicon have the capability to do that.
 
Next-gen consoles are going to have a low/mid-range AMD CPU with integrated graphics... they'll barely be able to do 1080p @ 30 fps, let alone compete with PC. The only reason they'll even be able to do 30 fps at that resolution is the low overhead of the APIs.
 
Next-gen consoles are going to have a low/mid-range AMD CPU with integrated graphics... they'll barely be able to do 1080p @ 30 fps, let alone compete with PC. The only reason they'll even be able to do 30 fps at that resolution is the low overhead of the APIs.

I assume you're referring to the "leaked" info on the PS4 devkits, which have AMD's A10 APUs? Pretty sure those will have dedicated graphics as well.
 
the source engine is amazing. i dont really care what people say. dota 2 looks fantastic for an engine this old. is it making my 670 work its hardest? naw dude, not even a little. but why do i give a shit? if it looks good, it looks good.

some of the details on dota 2 are amazing. i love the way it looks all around.

another thing about source engine is its extremely efficient. even when my computer was super shitty source games ran well and looked great. thats what a good engine does.

source engine 2 will be beautiful im sure.
 
There's still a substantial amount of overhead, and it isn't in any way insignificant. I'm hopeful that the impending next-generation consoles are going to force graphics vendors to unify on a memory model and on instructions such that developers can simply write code on top of GPUs. It's something many developers have expressed considerable interest in already. If there were no significant benefit to it, no one would have expressed any interest in it.

Graphics APIs are absolutely what you want for 97% of games out there, but the 3% of games that could benefit from being written much closer to metal are likely going to have the highest budgets and are going to be competing at the highest level. It'd be a great thing for the PC gaming industry if developers who want to shed layers of abstraction and close the distance between code and silicon have the capability to do that.

Afaik, the 360 also hits D3D and has some sort of abstraction layer. Supposedly you can work around it with low level hackery, but MS apparently doesn't like you resorting to that kind of stuff. I guess they're worried about stuff breaking if they ever had to change something hardware wise. Here's something interesting which seems to confirm that alongside some other interesting bits (hint: search draw call): http://www.humus.name/Articles/Persson_CreatingVastGameWorlds.pdf

In their case, it just straight up wasn't a bottleneck for them, even despite the scary numbers. Alongside the driver/API improvements, there are even more tools at your disposal now to reduce it, like the revamped instancing which DICE used to apparently great effect: http://dice.se/wp-content/uploads/GDC11_DX11inBF3_Public.pdf CPUs are only getting nastier and the overhead will continue to get lower. The sooner developers start cutting out the legacy bullshit and take advantage of the modern solutions, the better.
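
For the curious, here's roughly what that instancing trick looks like in D3D11. This is just a minimal sketch, not DICE's actual code; the function, buffer names, vertex layout, and instance format are all made up for illustration:

Code:
#include <d3d11.h>

// Draw 'treeCount' copies of one mesh in a single API call. Per-instance
// data (here assumed to be a 4x4 world matrix) rides in a second vertex
// stream alongside the shared geometry.
void DrawTrees(ID3D11DeviceContext* ctx,
               ID3D11Buffer* meshVB,      // shared geometry
               ID3D11Buffer* instanceVB,  // one world matrix per tree
               ID3D11Buffer* meshIB,
               UINT indexCount, UINT treeCount)
{
    ID3D11Buffer* vbs[2]     = { meshVB, instanceVB };
    UINT          strides[2] = { 32, 64 };  // assumed vertex/instance sizes
    UINT          offsets[2] = { 0, 0 };
    ctx->IASetVertexBuffers(0, 2, vbs, strides, offsets);
    ctx->IASetIndexBuffer(meshIB, DXGI_FORMAT_R32_UINT, 0);
    ctx->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);

    // One call submits every instance. The naive version is a loop of
    // treeCount DrawIndexed() calls, each paying full driver overhead.
    ctx->DrawIndexedInstanced(indexCount, treeCount, 0, 0, 0);
}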
 
Yeah, there are means to bypass layers, and oftentimes bypassing those layers is key to optimization (how do you optimize against draw-call overhead? Well, don't make the calls! Batch; instance; use some other technique to get away from making another call). It only gets worse when you're dealing with multiple g-buffers and trying to accumulate all of that into something sensible that the API can actually deal with. The various GPGPU languages are available to get around certain obvious restrictions imposed by the API, and they've been used for some interesting things, albeit somewhat sparingly, but they're still suboptimal from a language and implementation perspective. I haven't done a whole lot of research on any of them, but I'm sure it's at least feasible to render entire scenes with DirectCompute and just pass the result to Direct3D for drawing. It's possible, but it would also be absolutely torturous.

One issue Carmack brings up occasionally is that there's no standard way to interact with GPU memory. On the consoles, you can get pointers to it: if you want to put something in memory there, you just put it there. On the PC side, you have to engage in a lot of locking and unlocking, and in between that, you're pushing stuff into the API black box and you just cross your fingers that it's going to do it right. It's that way for a reason, but if there were a standardized means for direct GPU memory access, it wouldn't be necessary. You can get away from the API and just stick what you want to stick in memory. It's obviously there, so just let developers stick stuff in it.
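
To make the lock/unlock dance concrete, here's what it looks like in D3D11. Just a sketch; the function and names are hypothetical, and the buffer is assumed to have been created with D3D11_USAGE_DYNAMIC and D3D11_CPU_ACCESS_WRITE:

Code:
#include <d3d11.h>
#include <cstring>

void UploadVertices(ID3D11DeviceContext* ctx, ID3D11Buffer* vb,
                    const void* verts, size_t byteCount)
{
    D3D11_MAPPED_SUBRESOURCE mapped = {};
    // WRITE_DISCARD says we don't care about the old contents, so the
    // driver can hand back a fresh region instead of stalling the GPU.
    // The pointer can differ every call; you never hold "the" GPU memory.
    if (SUCCEEDED(ctx->Map(vb, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped)))
    {
        memcpy(mapped.pData, verts, byteCount);
        ctx->Unmap(vb, 0);  // hand it back to the API black box
    }
    // On a console you'd skip all of this: keep a raw pointer into GPU
    // memory and just write to it directly.
}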

I think the only reason this hasn't happened yet is because AMD and NVIDIA don't currently feel threatened enough to want to come together for the betterment of the industry they both participate in. I'd like for the next-gen consoles to change that, but with AMD supplying all of the console muscle this round, I don't think they'd be willing participants in any kind of cooperative effort to standardize on what needs to be standardized to make the no-API option a feasible thing.
 
the source engine is amazing. i dont really care what people say. dota 2 looks fantastic for an engine this old. is it making my 670 work its hardest? naw dude, not even a little. but why do i give a shit? if it looks good, it looks good.

some of the details on dota 2 are amazing. i love the way it looks all around.

another thing about source engine is its extremely efficient. even when my computer was super shitty source games ran well and looked great. thats what a good engine does.

source engine 2 will be beautiful im sure.

Portal 2 looked pretty damn good too IMO. Some folks lament the lack of shaders, but I think that's a stylistic choice. If Valve wanted to use a ton of shaders, they'd code it into Source. Personally, I like the fact that I'm not stuck in some shader/blur/bloom hell like in some UE3 games.
 
Portal 2 looked pretty damn good too IMO. Some folks lament the lack of shaders, but I think that's a stylistic choice. If Valve wanted to use a ton of shaders, they'd code it into Source. Personally, I like the fact that I'm not stuck in some shader/blur/bloom hell like in some UE3 games.

yeah i agree, i think they timed the new engine pretty well. and im excited to see what is gonna come with it.. although im sure it will be half life 3
 
No, I'm saying your post was a rambling incoherent mess.

Fact is, in the real world consoles have a bigger market than PC gaming. Does that mean PC is dead or dying? Of course not. But any developer who wants to make money is going to put a lot of focus on consoles.

From what I gather your post was a diatribe about the superiority of PC over console. And technically you are correct. But consumers love consoles. So they'll continue to be the majority platform for gaming.

that's an interesting pov (not really) when you've been rambling about consoles for no apparent reason, in a thread about pc gaming devs. is it because of orangebox? not sure why that would be evidence of any planning for closed platforms whatsoever. they might not even be writing any new code for windows anymore by the time this is out.
 
that's an interesting pov (not really) when you've been rambling about consoles for no apparent reason
I've been providing historical information in context with the discussion at hand. Perhaps you should go back and try reading?

in a thread about pc gaming devs.
Valve aren't exclusively PC gaming devs. And this thread isn't just about PC gaming. That is why I posted here and not the PC Gaming subforum.

is it because of orangebox? not sure why that would be evidence of any planning for closed platforms whatsoever. they might not even be writing any new code for windows anymore by the time this is out.
Entirely possible. But that's most likely just wishful thinking from someone with an elitist "PC gaming master race" attitude. Valve aren't fools and realize that their new engine would need to be multi-platform capable, especially since they'll surely want to license out the engine, as they did with Source. Valve also won't write the engine exclusively for Linux (as I assume you were implying) because, again, they are fully aware of their market (Steam Hardware Survey, anyone?).

I'm not a huge fan of consoles either. But I'm not so delusional as to think they are irrelevant or going away any time soon. You're in denial if you think otherwise.
 
wonderfield said:
Quote:

Originally Posted by socK

Some people in this thread make it sound like there's an utterly insurmountable overhead or something that is always there, looming, and tanking framerates.

There's still a substantial amount of overhead, and it isn't in any way insignificant. I'm hopeful that the impending next-generation consoles are going to force graphics vendors to unify on a memory model and on instructions such that developers can simply write code on top of GPUs. It's something many developers have expressed considerable interest in already. If there were no significant benefit to it, no one would have expressed any interest in it.

Graphics APIs are absolutely what you want for 97% of games out there, but the 3% of games that could benefit from being written much closer to metal are likely going to have the highest budgets and are going to be competing at the highest level. It'd be a great thing for the PC industry if developers who want to shed layers of abstraction and close the distance between code and silicon have the capability to do that.
 
Fucking misclick post

One of the things DX10 lets you do is submit an array of resources, like vertex buffers or render targets, and set them in a single function call. Just one of the many ways they've worked to help you speed things up.
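
Something like this, for example. A made-up sketch (the device, buffers, and views are assumed to already exist); in D3D9 each of these would be a separate call per stream and per render target:

Code:
#include <d3d10.h>

void BindFrameResources(ID3D10Device* dev,
                        ID3D10Buffer* positions, ID3D10Buffer* normals,
                        ID3D10RenderTargetView* rtColor,
                        ID3D10RenderTargetView* rtNormal,
                        ID3D10DepthStencilView* dsv)
{
    // Two vertex streams bound in one call...
    ID3D10Buffer* vbs[2]     = { positions, normals };
    UINT          strides[2] = { 12, 12 };  // 3 floats each, assumed
    UINT          offsets[2] = { 0, 0 };
    dev->IASetVertexBuffers(0, 2, vbs, strides, offsets);

    // ...and both G-buffer targets set at once.
    ID3D10RenderTargetView* rts[2] = { rtColor, rtNormal };
    dev->OMSetRenderTargets(2, rts, dsv);
}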

Shared memory sounds nice, but you would probably have to partially reinvent the wheel for some things. For example, when you lock a resource you might not get the same pointer twice; the driver may just give you a new chunk of memory to avoid a stall. Such is the cost of power, I guess.

 
I was wondering where the console vs computer thread went.....


Also, who didn't see this coming? I think everyone expected a new engine to come out, especially for HL3.

I agree that the Source engine has held up very well over the last decade, but there just comes a point where you have to start from scratch, or at least heavily mod the code base.
 
I agree that the Source engine has held up very well over the last decade, but there just comes a point where you have to start from scratch, or at least heavily mod the code base.

Chances are they didn't really "start from scratch" with Source 2. Game engine evolution typically builds on top of existing engines with significant improvements.
All the updates we've seen for Source over the past 8 years have been smaller, incremental improvements. Source 2 will still be Source, but with much larger changes. Just like every new version of CryEngine or Unreal is built upon the previous version.
 
Chances are they didn't really "start from scratch" with Source 2. Game engine evolution typically builds on top of existing engines with significant improvements.
All the updates we've seen for Source over the past 8 years have been smaller, incremental improvements. Source 2 will still be Source, but with much larger changes. Just like every new version of CryEngine or Unreal is built upon the previous version.

this.


935085_20080804_screen001.jpg

1175789623.jpg

just to give you an idea: that is Frostbite revision 1. that's Bad Company 1.

now look at a BF3 screenshot.

same engine, different revision/version.

they just make adjustments and big upgrades to the engine; it will just be Source 2.0, just like any program.
 
Chances are they didn't really "start from scratch" with Source 2. Game engine evolution typically builds on top of existing engines with significant improvements.
All the updates we've seen for Source over the past 8 years have been smaller, incremental improvements. Source 2 will still be Source, but with much larger changes. Just like every new version of CryEngine or Unreal is built upon the previous version.

While that's likely, I certainly hope they threw out their lighting model and started from scratch. It's more or less the same precompiled lighting from Quake and HL1, with little to no flexibility when it comes to dynamic lighting and shadowing. It's just not going to fly in 2012 and beyond, with either gamers or level designers.
 