Thief Video Card Performance Preview @ [H]

Nixxes did the port, you can blame Square-Enix for the game itself.
 
The last AC game we evaluated was AC3 in 2012. It was pretty much a waste of time even then, and yes, it still is with Black Flag.

Ghosts was also a turd, and not worth our time. Looking back through my emails, I informed Kyle on 11/19 that neither NFS Rivals nor COD: Ghosts was worth it. We had more important things to do.

Believe me, I want a game we can add to our gaming suite, or replace an old game with, but none of those games listed are it, sorry.

I should also add that NFS Rivals has a hard-coded 30 fps framerate limit and AC IV has a hard-coded 62 fps framerate limit. What's the point of benchmarking these games? I am not sure why Brent is being griefed about this, because it's obvious that these games should not be benchmarked. Especially NFS Rivals - that game is such a joke. The physics engine in that game is also tied to the framerate limit, so if you remove the FPS limit, the game breaks. So dumb.
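To illustrate why that breaks (a hypothetical sketch, not Rivals' actual code): if the simulation advances a fixed amount per rendered frame instead of per unit of real time, uncapping the framerate literally speeds up the game world.

```python
# Hypothetical sketch of frame-tied physics: the sim advances a fixed
# step per rendered frame, so the tuning only holds at the capped rate.

FIXED_STEP = 1.0 / 30.0  # tuned around the 30 fps cap

def simulate(frames_rendered, speed=10.0):
    """Advance a car's position by one fixed step per rendered frame."""
    pos = 0.0
    for _ in range(frames_rendered):
        pos += speed * FIXED_STEP  # distance per frame, not per real second
    return pos

# One real second of gameplay:
at_cap = simulate(30)    # 30 frames in that second -> 10.0 units
uncapped = simulate(60)  # 60 frames in that second -> 20.0 units, double speed
```

Same wall-clock second, twice the simulated distance once the cap is gone - which is why removing the limiter wrecks the game instead of just smoothing it out.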

On the other hand, AC IV is a fucking OUTSTANDING game. It is seriously one of the best releases of 2013, and the graphics were VERY VERY good IMO. But it still is not the best candidate for benchmarking, due to the framerate limit. Not sure if that has been removed in a recent patch (it's been a while since I fired it up), but every AC game has been like that. So it's not a good benchmark. It is definitely a GREAT game, though.
 
Nixxes did the port, you can blame Square-Enix for the game itself.

I don't care who gets the blame. I was just mentioning that Nixxes/Eidos Montreal has a solid track record, and Square Enix is the publisher, right? I don't think publishers have complete control over gameplay mechanics; generally speaking the developer does all of that stuff - and the developer was Eidos Montreal. So it's not like Square Enix takes the hit on this. Square Enix literally takes what Nixxes/Eidos gives them and sells it. Square does not code the game. Publisher versus developer. Two different things. If anything, Eidos Montreal shoulders much of the blame, NOT Square. Square literally, as mentioned, takes what is given to them and handles marketing and packaging. And that was given to them by Eidos, with Nixxes doing the PC port.

Anyway, I don't want to get into an argument about who shoulders the blame, I have zero interest in that. All in all NONE of this matters, it's like arguing semantics. Who the fuck cares whether it's Square Enix or Eidos/Nixxes. ;) It's just not a compelling game compared to the prior Thief games. I think most people who enjoyed the prior games way back in the day would agree - it just defies everything that made the prior games so, so good. As a fan of the franchise I was just disappointed. How'd you feel about it? Did you play the prior Thief games back in the day? Deadly Shadows is still one of my favorite games of all time. I *loved* that game. Going from that to this....ugh.
 
I misspoke, yes, I meant Eidos.

Well, it's kind of like blaming the UPS guy for delivering crappy hardware. It's not his fault, he just brought you the stuff.

I was just pointing out that Nixxes didn't have a hand in the game mechanics.
 
Honestly, the game still looks like it has textures straight from 2006. The game does not have impressive graphics, although MP is semi-mindless fun at times. Keep in mind COD: Ghosts uses a modified Quake 3 engine. Yes, you heard correct. A modified Quake 3 engine. And the textures are AWFUL up close despite the tessellation and all of that shit. Tessellation doesn't magically transform awful low-resolution textures into something they aren't - the assets in COD: Ghosts just aren't up to par with modern AAA titles.

So I completely agree with Brent in this respect. It's just not a game worth benchmarking IMO. That said, I've had a bit of stupid fun in MP. But it certainly is not an amazing game, and the graphics are really not good - particularly the textures.

So does Thief. Which is odd that there is a performance preview for a game on such an old engine. Is it not obvious that this game isn't taxing at all?
 
Pretty sure Thief uses UE3.

That said, developers need to stop using it... it is starting to look really dated. UE4 has been out for... a while now. Early 2012?
 
So does Thief. Which is odd that there is a performance preview for a game on such an old engine. Is it not obvious that this game isn't taxing at all?

Er... no. You bolded the Quake 3 engine to imply that Thief 4 uses it. That isn't true. The Thief reboot uses Unreal Engine 3 from Epic, which is a better engine than what COD uses; while Thief doesn't look spectacular, it does look better than BO2 or Ghosts. Also, development of the Unreal engine is an ongoing project; it isn't a "set" point of reference - Epic has continually updated UE3 over the years to add additional features. That certainly doesn't mean that it's on the same level as CryEngine 3 or Frostbite 3 in BF4, but I will point out that UE3 has several games with excellent graphics - I do think that both Batman: AC and AO have very, very good graphics. Certainly far better than either BO2 or Ghosts, by a mile.

So this is all the more telling as to why the COD franchise in general needs a complete overhaul in terms of the engine used; while I can overlook graphics, the poor graphics in the COD franchise are just rather egregious. Which is understandable given the use of a modified Q3 engine with low-resolution textures. I have conflicting feelings on this: I loved BO2 and think COD: Ghosts is rather meh. Either way, I feel like a graphical overhaul should be done for COD in general next year. We'll see, I guess; I suspect Activision will want the Q3 engine to be used. Yet again. Seems like Activision will milk the COD franchise with the same old engine, which is rather sad, really.
 
We assessed each game and determined it wasn't worth our time. I am only glad I was able to get my money back on NFS Rivals.

I nominate this reply for your "Best response to a Strange Question" category thus far in 2014! Indeed, along the lines of "When a tree falls in the woods, does it make a sound?", we could ask the following, "When another lack-luster, cookie-cutter game is released, does anyone play?"....:D
 
After the long wait for Mantle to come to Battlefield 4, we would have thought they would have got it right with the next game to showcase the technology, Thief. There is no question that TrueAudio and Mantle need all the showcasing they can get to prove the technology. At least CrossFire and Eyefinity work at game launch.

I have no doubt you recall GLIDE 3.x as well as anyone, Brent. What really sticks in my mind from that time frame is how it took D3d until version 7.x to achieve a reasonable parity with GLIDE 3.x. There was actually a fairly long wait between the advent of D3d and the time at which D3d(7.x) became more or less as useful and as graphically effective as GLIDE. Longer still after that before D3d clearly became the superior 3d API for Windows, and at that point even 3dfx moved to D3d officially.

This is not the same situation, I know. And I agree that it will be beneficial both to AMD and the game developers to get their Mantle stuff done ASAP. But I'm not really worried about the time factor as Mantle is brand new and to get it right will take some time. I'd worry, frankly, if it went too quickly and smoothly because then I'd likely lower my expectations.

The biggest factor for me, though, is TrueAudio - and of course I won't argue with any performance increases - and/or special FX, if those should find their way into Mantle at some point. These days I'm all ears - uh, phones... ;) The thought of actually getting a fully responsive up-to-7.1-channel (decided per developer, per game) sound stage with standard stereo phones blows me away! That's quite a big plus for an added API feature, imo, especially because it is based in the GPU hardware. If I hadn't listened to this demo of the technology:

https://www.youtube.com/watch?v=nKnhcsRTNME

I probably would think much of the publicity about it had been hyped. This is an amazing demo to listen to with plain old stereo phones. All of a sudden, all of those gimmicky "5.1/7.1 Headphone" sets with crazy price tags and dubious results are a thing of the past! I can't wait to see what developers do with this...
 
The last AC game we evaluated was AC3 in 2012. It was pretty much a waste of time even then, and yes, it still is with Black Flag.

Brent, so when you say "pretty much a waste of time" do you mean as far as a game you just don't like, or one that has nothing special from a graphical point of view for modern games?

While the AC games have been hit or miss for me (mostly miss), Black Flag has been a blast to play and the graphics do look pretty good.
 
Brent, so when you say "pretty much a waste of time" do you mean as far as a game you just don't like, or one that has nothing special from a graphical point of view for modern games?

As noted earlier in this thread, AC4 has a hard-coded limit of 62 FPS. That kills it for benchmarking purposes, regardless of the quality of the graphics or gameplay.
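A toy illustration (hypothetical numbers) of why a hard cap kills a GPU comparison: every card fast enough to reach the cap reports the same framerate, so the benchmark can't tell the cards apart.

```python
# Toy model of a capped benchmark: the game never reports more than the
# hard-coded limit, no matter how fast the GPU actually is.

CAP = 62  # AC4's hard-coded framerate limit

def reported_fps(uncapped_fps, cap=CAP):
    """FPS the capped game shows for a GPU capable of `uncapped_fps`."""
    return min(uncapped_fps, cap)

midrange = reported_fps(90)   # -> 62
highend = reported_fps(140)   # -> 62, identical to the midrange card
slow = reported_fps(45)       # -> 45, only sub-cap cards differentiate
```

Once every card in the test clears 62 fps, the chart is a flat line - which is exactly why a capped game is useless for a video card evaluation.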
 
I don't know what it is, but every game that has Mantle makes me put it on my games-to-buy list. Maybe it's clever marketing mixed with specific hardware.
Kind of like DX 10/11 was. But it still needs to be done properly. Same story, I guess.
Still curious about this Mantle stuff... even though I notice it's the same story again. I guess I can't change.
 
So AMD now is working on integrating Mantle into the Unreal Engine, commendable. I would say this is more newsworthy than the game itself.

The more game engines supported, the better, even older, well known ones.
 
I agree - if they could integrate Mantle into the Unreal Engine, it would give current AMD owners a boost in performance in a whole lot of games.
 
So AMD now is working on integrating Mantle into the Unreal Engine, commendable. I would say this is more newsworthy than the game itself.

The more game engines supported, the better, even older, well known ones.

Hmm? I don't believe your statement is correct. The Mantle work in Thief was done entirely by Nixxes/Eidos, for Thief and only Thief; I don't think I've heard anything about Epic (creators of Unreal Engine, obviously) being interested in Mantle. Actually... the creator of Unreal Engine expressed outright hostility towards the idea of Mantle integration - specifically, Tim Sweeney thought it was a bad idea and said he would not use it at all in UE3/4. He favors a more DirectX-centric approach, and MS is adding more direct GPU access with whatever the next iteration of DX is, apparently.

Here's a YouTube video of Tim Sweeney (UE3/4 engine creator) discussing Mantle along with Johan and John Carmack:

http://www.youtube.com/watch?v=3RF0zgYFFNk

Carmack and Sweeney both were really against GPU-specific APIs for PCs, while Johan... well, we obviously know where he stands. ;)

Not saying you're wrong, but Tim Sweeney was really dead set against Mantle at his last press appearance, so it seems doubtful - and as mentioned, 100% of the Mantle work for Thief was done by Nixxes and/or Eidos, not Epic themselves (creators of Unreal Engine). Of course Tim Sweeney may have had a change of mind, so I was just curious if you had a source - if that's true it's cool for you AMD guys, but Sweeney would really be doing a 180 if he did change his mind about Mantle. He wasn't a fan last time he spoke of it... but, like I said, if he did have a change of heart it would be awesome for AMD users. UE3/4 would be huge since it is basically THE biggest multiplatform engine for both consoles and the PC...
 
UE3 is pretty extensible, so Nixxes could potentially license out the UE3 Mantle libraries to other developers, bypassing Tim Sweeney and Unreal completely.

It would be much cheaper from a development standpoint to pay license fees than to redevelop the libraries from scratch.
 
UE3 is pretty extensible, so Nixxes could potentially license out the UE3 Mantle libraries to other developers, bypassing Tim Sweeney completely.

Yeah, well, aside from the rampant speculation here - it's pretty much not going to work that way, since licensing Unreal Engine is not free, and a third party licensing out "their" version of Unreal Engine would obviously have intellectual property and litigation issues. From what Nixxes has stated in interviews, their integration of Mantle didn't take an incredible amount of development time - a couple of months, IIRC. So it's more than likely a non-issue, but it won't be broad engine support; it will be game-by-game support, which seems easy enough so long as AMD continues their work with developers.

Now, what you're saying is not going to happen. UE3 is not free to give out or fuck with; it is Epic's intellectual property that they charge for on a per-game basis. You also cannot re-sell or license someone else's intellectual property, of course. Anyway, I was mainly interested in whether Sweeney changed his mind. Which would be cool for the AMD guys, I suppose. That is what I gathered from Yakk's post - he seems to indicate that there was a change of heart at Epic, and like I said, that would be semi-cool for you guys.
 
Tim Sweeney being green through and through is what makes this development all the more interesting! :)

Could it be AMD developed this FOR Nixxes? If so, AMD could distribute the Mantle extension for UE3 free of charge to everybody. Sounds like something AMD would do; curious about confirmation of this.

Lots of business opportunities here, especially since AMD has stated Mantle can use DX pathways already made and just bypass the CPU calls (an oversimplification, but you get the idea).
 
Yeah, well, aside from the rampant speculation here - it's pretty much not going to work that way, since licensing Unreal Engine is not free, and a third party licensing out "their" version of Unreal Engine would obviously have intellectual property and litigation issues. From what Nixxes has stated in interviews, their integration of Mantle didn't take an incredible amount of development time - a couple of months, IIRC. So it's more than likely a non-issue, but it won't be broad engine support; it will be game-by-game support, which seems easy enough so long as AMD continues their work with developers.

Now, what you're saying is not going to happen. UE3 is not free to give out or fuck with; it is Epic's intellectual property that they charge for on a per-game basis. You also cannot re-sell or license someone else's intellectual property, of course. Anyway, I was mainly interested in whether Sweeney changed his mind. Which would be cool for the AMD guys, I suppose. That is what I gathered from Yakk's post - he seems to indicate that there was a change of heart at Epic, and like I said, that would be semi-cool for you guys.

You missed my point.

If a developer is already using UE3, they have already licensed it from Unreal.

Nixxes could potentially sell/license its Mantle libraries (they are extensions, not part of the core engine) so the developer doesn't have to spend the development resources on creating them itself.

Taking this a step further, AMD could develop extensions for all popular engines itself and give them to developers free of charge.
 
Oh geez, this again?

Not for a couple of years they aren't.

In the meantime, we have Mantle.
 
Gave me a chuckle also... AMD, NVIDIA, and Intel are all presenting at GDC; nobody is surprised. Oxide developers even said Mantle was just a stepping stone.

Mantle is a great blueprint on which to build an open standard, just like the VESA open standard AMD presented. All three companies have a vested interest in getting this to work. It should help AMD on Mac Pros also, as an example for OpenGL.

It will be a good while, though; until then we have a slew of upcoming Mantle games!
 
1) UE4 isn't compatible with last-gen consoles, so that is the reason we haven't seen many, if any, games using it yet. By the end of this year and into next, I think we will see an uptick in how many games are using it.

2) This game is actually over four years old, technically. At one point it was called Thief 4, and that was quite a few years ago. That is the main reason it is using UE3 and looks dated.

3) The PS4 and Xbox One are already using TrueAudio features (as both consoles have the capability implemented in different ways) for Thief. I was hoping this would be a good jump, as sound has just been so stagnant since Aureal 2 and EAX 4 or 5. If this is the game meant to showcase TrueAudio, it is not impressive. It isn't any better than the original Thief games or A3D games. I am still hoping this is just this game, as I find it sad sound hasn't advanced, but the sound in Thief on the consoles isn't impressive at all, really. Forza and Outcast have much better sound environments and are great-sounding games. Hopefully games follow them more and not Thief.
 
Yeah, what happened there was this: basically, with the marketing of the 290X at launch and Frostbite 3, AMD made it seem like every FB3 game would use Mantle. That turned out not to be the case - while the engine has the option for Mantle functionality, it is not compulsory. And several FB3 games aren't using Mantle, as it turns out - NFS isn't, PvZ I'm not sure, and most of the other FB3 games generally won't be released until 2015 or beyond. The one game that remains is Dragon Age: Inquisition, and BioWare will not comment on whether Mantle will be included or not.

I'd assume that if AMD is pushing Mantle as hard as they could be (and they should be, obviously), they would announce on their Mantle portal that Dragon Age is confirmed for Mantle. So for 2014 the remaining confirmed FB3 releases are Dragon Age and Plants vs Zombies - PvZ may get it (although I can't see this being all that useful? It isn't a demanding game. Maybe for APUs and tablets??) and Dragon Age is up in the air. Which FB3 games will use Mantle I am not completely sure, but it's clear that not all Frostbite 3 games will use Mantle. It's an option, but it is not compulsory for any developer using FB3.

Of course Star Citizen will be a big net win for Mantle, although that again won't have all modules completed until 2015. But with the history behind Wing Commander back in the day, that's going to be pretty awesome for AMD users.

What I'm curious about is whether AMD will opt to use Mantle in perhaps a different way - anyone remembering Glide from way back in the day will remember that some Glide games didn't have significantly faster performance than their DX counterparts, but they *did* have specific graphical features which D3D did not. I'm curious as to whether AMD could implement stuff like that - that would be cool for, say, Thief, which isn't a super demanding game on its own. But if they could crank the dials to 11 in terms of graphical features above and beyond the baseline, that would be a pretty cool way to use Mantle as well. I guess that could make sense in games where all-out framerate isn't super important because the baseline engine isn't very demanding, as in Thief's case. Anyway, I'm just thinking outside of the pure framerate mindset. Obviously Mantle is beneficial for AMD hardware; perhaps AMD could leverage it in ways other than all-out framerate. Framerate of course makes complete sense for demanding games like BF4, but maybe in the case of Thief, upping the graphics to 11 would be even better - since Thief already runs great on a wide variety of hardware.
 
First, I apologize if my post came across as negative; it was not meant to be. I was providing a "review" of your preview and some of what I would like to see (I'm just one person, but I have what I feel is a very common setup). Don't get me wrong, you guys do a bang-up job. I have noticed a trend towards nitpicking, so I would like to step back and say: good job over the years, [H]!

Second, I received a copy of Thief with my Ruby Rewards bundle and installed it last night...not anywhere near what I was expecting, not anywhere near my experience when the first one came out. Graphics are subpar, script is subpar; heck there was a prompt to "Hit E to vault over small objects" or something and I thought they were trying to get me to leap over an end table, turns out it was a prompt for the window which was like 10 feet away. I ran in circles spamming E for a few minutes to try and vault over the chair etc, which would have been cool to add as a feature. Crouched and able to jump over stuff. Anyway, first impressions are not good, but it ran well on an R9 270X/FX-8320. I am hesitant to put any more time into the game, but I probably should just because it was free. We'll see how it goes.
 
3) The PS4 and Xbox One are already using TrueAudio features (as both consoles have the capability implemented in different ways) for Thief. I was hoping this would be a good jump, as sound has just been so stagnant since Aureal 2 and EAX 4 or 5. If this is the game meant to showcase TrueAudio, it is not impressive. It isn't any better than the original Thief games or A3D games. I am still hoping this is just this game, as I find it sad sound hasn't advanced, but the sound in Thief on the consoles isn't impressive at all, really. Forza and Outcast have much better sound environments and are great-sounding games. Hopefully games follow them more and not Thief.

I still have my old Diamond Aureal Vortex 2 somewhere. A3D is (imho) FAR superior to anything we have nowadays. I loved the realistic occlusion effects and material-based reverb, and the positional audio was spot on.
I regretted having to move to EAX and Creative cards, until I got a little closer to the A3D experience with my Xonar D2X. But even then the positional audio is off and there is no sound occlusion.
Aureal 3D had it down 10 years ago. WTF happened?
 
Dolby surround happened? 3D sound stage/3D sound/positional audio still exists on practically every high-end audio chip/chipset. Most PC gamers simply use the cheapest onboard motherboard sound they can buy and don't use it, though. Anyway, TrueAudio could be compelling, but I don't think it's a great feature to add to a GPU. Don't get upset or anything, just my opinion. 3D sound still exists, but most PC gamers simply ignore it.

That said, I bought a Diamond Monster Sound 3D with the A3D chip way back in the day and was also completely blown away - I still have both that card and a Sound Blaster AWE64 Gold in my garage somewhere. ;) Speaking of Diamond - MAN, Diamond MM was the shit back in the late 90s; the current modern-day Diamond is NOT the same company. Diamond was more or less the "EVGA" of the late 90s, by far the best brand of anything to buy. Then they went out of business and some other firm bought them - they haven't been the same since. ANYWAY... to this day I refuse to use motherboard sound, because I do notice a big difference in sound quality between a high-end discrete sound card and motherboard audio. Realtek audio doesn't cut it for me.
 
If anybody uses headphones, you can get "fake" 3D audio if you set Windows and the game to 5.1, set the headphone settings to headphone (or stereo), and enable whatever 3D audio algorithm you use (Dolby Headphone, CMSS-3D, or that other one I forget). Not as accurate as true 3D audio, but it gets pretty fuckin' close. Sound quality takes a small hit too, due to the algorithm's distortion.
 
Not that many, actually. Only four, and two of them have passed already. What remains is Sniper Elite 3 and Star Citizen. There were many possible Frostbite games planned at the DICE presentation, but that was scrapped (for example, Plants vs Zombies) and we never heard from them again.

http://www.amd.com/us/products/technologies/mantle/Pages/mantle.aspx#2

If you are referring to PvZ: Garden Warfare, not only is it NOT scrapped, it's been released on XB1 already (the PC release is due in June 2014).
 
Dolby surround happened? 3D sound stage/3D sound/positional audio still exists on practically every high-end audio chip/chipset. Most PC gamers simply use the cheapest onboard motherboard sound they can buy and don't use it, though. Anyway, TrueAudio could be compelling, but I don't think it's a great feature to add to a GPU. Don't get upset or anything, just my opinion. 3D sound still exists, but most PC gamers simply ignore it.

That said, I bought a Diamond Monster Sound 3D with the A3D chip way back in the day and was also completely blown away - I still have both that card and a Sound Blaster AWE64 Gold in my garage somewhere. ;) Speaking of Diamond - MAN, Diamond MM was the shit back in the late 90s; the current modern-day Diamond is NOT the same company. Diamond was more or less the "EVGA" of the late 90s, by far the best brand of anything to buy. Then they went out of business and some other firm bought them - they haven't been the same since. ANYWAY... to this day I refuse to use motherboard sound, because I do notice a big difference in sound quality between a high-end discrete sound card and motherboard audio. Realtek audio doesn't cut it for me.

True - Diamond Multimedia was basically the branded alternative to ATI or Matrox (and even Number Nine at the high end) in graphics, and to Creative in audio. I had mentioned the Monster 3D II (which I still have) - I also had a Monster Sound (my second PCI sound card) that actually replaced an Ensoniq AudioPCI. However, Diamond got caught in successive back-to-back-to-back meltdowns - among those were Aureal and S3 (and 3dfx's own acquisition of STB Systems, screwing over all their distributors - including, if not especially, Diamond - in the process). Aureal had the best MIDI of anyone during the 1990s - Tyrian (both the original and the later Tyrian 2000) was one game that was extremely useful in showing how good the Aureal MIDI was compared to anything Creative. However, fewer and fewer games took advantage of MIDI during the same period, which also marked the beginning of the ascendancy of Windows gaming - which basically spelled doom for Aureal.
 
Erm. MIDI? You are aware that Aureal had nothing to do with MIDI sound, correct? I'm just really confused by your mention of that; as far as I can tell, Aureal came to fruition based on their 3D audio chips, which were unprecedented at the time. MIDI died because CD audio soundtracks, MP3s, and digital audio became the norm, whereas in the early 90s audio CDs were not commonplace, so games used soundbanked files with MIDI instead. And on that note, the best MIDI audio was always the Roland SCD-15 (among other Roland products), a daughterboard for the Sound Blaster audio cards.

I'm really confused by your alignment of MIDI and Aureal. Aureal brought us 3D audio. MIDI has absolutely nothing at all to do with 3D audio; MIDI was displaced when larger HDDs, digital audio, MP3s, and audio CDs became commonplace.
 
Erm. MIDI? You are aware that Aureal had nothing to do with MIDI sound, correct? I'm just really confused by your mention of that; as far as I can tell, Aureal came to fruition based on their 3D audio chips, which were unprecedented at the time. MIDI died because CD audio soundtracks, MP3s, and digital audio became the norm, whereas in the early 90s audio CDs were not commonplace, so games used soundbanked files with MIDI instead. And on that note, the best MIDI audio was always the Roland SCD-15 (among other Roland products), a daughterboard for the Sound Blaster audio cards.

I'm really confused by your alignment of MIDI and Aureal. Aureal brought us 3D audio. MIDI has absolutely nothing at all to do with 3D audio; MIDI was displaced when larger HDDs, digital audio, MP3s, and audio CDs became commonplace.

I'm referring to MPU-401 MIDI in gaming (but even then, mostly DOS gaming) - not niche daughterboards like the Roland SCD-15 (which was a niche entirely because it was a daughterboard - daughterboards are a low-volume item, even today). CD audio soundtracks, MP3 audio, etc., rose simultaneously with Windows-native gaming - all of which diminished the importance of MIDI. I am saying that Aureal-based cards excelled in both 3D audio AND MIDI - nothing else (in their price range) excelled in both areas at once - heck, the closest in MIDI was the Gravis UltraSound.

However, of what use was 3D audio to the average gamer? With the advent of all those features we have both pointed out, what use was MIDI? Where Aureal excelled became not merely less important, but largely a non-issue.
 
Roland popularized MIDI through its synth keyboards and MPU-401 sound modules, but towards the end of the golden age of MIDI, Turtle Beach actually had better products.
 
I'm referring to MPU-401 MIDI in gaming (but even then, mostly DOS gaming) - not niche daughterboards like the Roland SCD-15 (which was a niche entirely because it was a daughterboard - daughterboards are a low-volume item, even today). CD audio soundtracks, MP3 audio, etc., rose simultaneously with Windows-native gaming - all of which diminished the importance of MIDI. I am saying that Aureal-based cards excelled in both 3D audio AND MIDI - nothing else (in their price range) excelled in both areas at once - heck, the closest in MIDI was the Gravis UltraSound.

However, of what use was 3D audio to the average gamer? With the advent of all those features we have both pointed out, what use was MIDI? Where Aureal excelled became not merely less important, but largely a non-issue.

You said that Aureal had the best MIDI, and I was only addressing that statement. Aureal did not "do" MIDI. I know very well why MIDI died, and I know which MIDI products were good back in the day. This is all ancient history, though, so I have no interest in arguing which MIDI products were the best and why MIDI died. The bottom line is this: Aureal did not do "MIDI". They created 3D audio for the mass market, and their A3D product was revolutionary at the time. Their strides in positional audio went on to inspire stuff like Dolby 3D and other such technology, so I have to give props to Aureal even though they're long gone. The chip itself had nothing whatsoever to do with MIDI. I was confused by your statement "Aureal had the best MIDI", which is, in fact, not true. They created the world's premier 3D audio chip - nothing at all to do with MIDI. In fact, by the time A3D was released, everyone was well into the transition towards CD audio and MP3s. The A3D did not need MIDI, because the world did not care about MIDI anymore. In fact, from what I remember, the vast majority of Aureal A3D-based cards used software-based MIDI emulation, much like the earlier Sound Blaster 16 cards.

I'm only addressing the fact that Aureal did not have the "best MIDI" (your statement); the A3D had nothing to do with MIDI. ;) A3D cards used software-emulated MIDI, which was completely shitty, just as the SB16 cards' was. Any A3D cards *with* good MIDI used a daughterboard for the soundbanks - from what I remember, some vendors did just that and others (most) didn't. The soundbanks, of course, were not created by Aureal; they were implemented by vendors and were a completely optional feature, just as daughterboards on the SB16 were completely optional. That is all. I was just confused by your statement.

On another note - I did smile at your mention of Ensoniq; I remember them very well. :) They had some pretty sick stuff back in the day. I think I had the Ensoniq Soundscape? I was a little tyke back then, heh. I got my DIY fix early on. But I did love that sound card; it was very, very good. Turtle Beach had some great stuff too, but they were mostly oriented towards the professional market - I seem to recall their stuff being very, very costly.
 
As someone who works and has friends and family who work with software (medical, games) it kind of hurts to read a reviewer criticizing a game that took a lot of effort. OK not everyone can like the game but it's tough to read nonetheless.
 

You are speaking in terms of studio-quality MIDI - darn right that was pricey. I have actually heard MIDI from a Roland SCD-15; however, the only dealer who actually sold them never stocked them, and they were horribly expensive, and therefore not really suitable for gaming. (That's also why I qualified my statement to "in that price range" - the Gravis UltraSound was the only other card in that price range with decent default MIDI. The SoundBlaster AWE line - the original AWE32, the SoundBlaster 32, and the AWE64 - DID have better MIDI; however, they adopted the same SoundFont process that Roland made famous - in fact, it was licensed from Roland. The AWE32 and AWE64 used proprietary memory modules, while the SoundBlaster 32 used standard DIMMs - this was the card that the AudioPCI, and later the MonsterSound, would eventually replace. The AWE cards were horrible sellers due to the pricey modules; the SoundBlaster 32 sold better, however. What eventually did in the AWE cards (and the SoundBlaster 32) was the death - in fact, murder - of ISA except for niche use.)

Do you remember HammerSound? They had commercial, shareware, and even free SoundFonts for any hardware that used the Roland SoundFont spec - and they could get seriously large. (I remember having their 4 MB and 8 MB Roland-sourced GM/GS SoundFonts for my SoundBlaster 32.)
 
The Sound Blaster AWE32, AWE64, etc. were not better than the SCD-15. I don't know where you're reading this stuff, but you are wrong, especially about the AWE32. That card had 512KB of SoundFont memory and was godawful; Roland was the gold-standard reference against which everything was compared. MIDI soundtracks were actually composed on high-end Roland hardware as well. Anyway, who the hell cares about MIDI? Long dead. ;)

My only point was that Aureal did not do MIDI (your statement). Aureal made their name not with MIDI, but with their 3D positional audio. I really felt that A3D was a groundbreaking product with many imitators - that was a great time for PC audio. Sadly, PC audio has fallen by the wayside as of late; nothing exciting is going on. Maybe TrueAudio will be something interesting, but audio is literally the last thing I care about on GPUs. And Dolby surround has been around for ages, yet most PC gamers just use the shittiest motherboard sound possible while ignoring Dolby surround (which is quite good).
 
Hi,
with the system in my signature (GTX580 SLI) I get these results maxed out at 1920x1200:

Min 17FPS
Avg 55FPS
Max 79FPS

If you compare my results with a GTX750 at 2560, it seems that a single GTX750 scores better than a single GTX580. Is that possible?
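For what it's worth, min/avg/max numbers like these are easy to derive yourself from a per-frame time log (the kind of dump FRAPS-style tools can produce), which makes cross-card comparisons less guesswork. A minimal sketch, assuming a list of frame times in milliseconds - the sample values below are made up for illustration:

```python
# Hypothetical sketch: turn per-frame render times (milliseconds)
# into the usual Min/Avg/Max FPS summary. Sample data is invented.

def fps_stats(frame_times_ms):
    """Convert per-frame times (ms) into (min, avg, max) FPS."""
    fps = [1000.0 / t for t in frame_times_ms]  # instantaneous FPS per frame
    # True average FPS = total frames / total time, not the mean of fps values
    avg = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)
    return min(fps), avg, max(fps)

sample = [16.7, 20.0, 58.8, 12.5, 18.2]  # made-up frame times in ms
lo, avg, hi = fps_stats(sample)
print("Min %.0f / Avg %.0f / Max %.0f FPS" % (lo, avg, hi))
```

Note that one long 58.8 ms frame drags the minimum way down even when the average looks healthy, which is why a low Min FPS matters more than the Avg when judging playability.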
 
It seems that a single GTX760 is better than my GTX580 SLI - isn't that strange?
EVGA Precision doesn't work in this game; do you know if it uses SLI well?
 