MSI GeForce GTX 780 Ti GAMING 3G Video Card Review @ [H]

>> I couldn't tell without taking the card apart, so I don't have a direct answer on this. To me, as a card buyer, the fact that it ran as fast and cool as it did tells me that it was likely functioning as designed...<<

Thanks David,

In your review I suspected as much, since there weren't photos of the card disassembled. I didn't want to pull apart my card just to see if there were thermal pads on the VRMs or not. As you say, as long as the card is functioning without issue, it's probably running as designed and shouldn't be anything to worry about. I do suspect (looking at TechPowerUp's pictures of the metal plate & thermal pads that were there) that the lack of pads on the VRMs was a manufacturing oversight on the board sample they had.
 
Kyle, question. TechPowerUp mentioned "Thermal pad missing on VRM circuitry" in their review of this card; is this true of this card as well? If so, I imagine this is something done on purpose, and I could anticipate having to install my own thermal pad.

Thanks
 
Good review, David. I like when [H]'s reviews include overclock comparisons.
 
My only bitch about this card is the "warranty void if removed" sticker on one of the HSF screws needed to disassemble the card.
 
Kyle, question. TechPowerUp mentioned "Thermal pad missing on VRM circuitry" in their review of this card; is this true of this card as well? If so, I imagine this is something done on purpose, and I could anticipate having to install my own thermal pad.

Thanks

Take a look at the post two before yours for my answer :eek::cool:
 
Funny that my custom-ROM'ed stock 780 Ti outperforms this. Once they release a custom ROM for this card though, then we'll talk turkey.

Seriously, the power governors on cards nowadays just hurt performance and the sales of new cards. My custom-ROM Ti beats this card by over 100MHz and exhausts air OUT of my case.

It's a great review, but once you enter the world of custom ROMs, you don't see these reviews quite the same anymore.

Isn't this card using a reference PCB? Wouldn't that mean a custom ROM would work?
 
Tests were rerun - we even tried a different 290X DCII OC card to see if we could beat the OC from the one featured in the last article about it.

Thanks. The FPS gains in the first 290X DCII OC article were (possibly) limited by throttling of the card's power delivery; AFAIK it hit 102C on stock clocks/volts, or maybe I'm wrong, I can't remember...
 
Isn't this card using a reference PCB? Wouldn't that mean a custom ROM would work?

It is a reference design with some changes in the VRM area, with some components replaced by smaller ones than on the reference board.

There are pictures of the PCBs over at TechPowerUp in the comments thread after W1zzard's review. Of note, he gave the card a 9.9 out of 10 for being blisteringly fast but quiet.
 
My MSI GTX 780 Ti Gaming showed up today. Initial impressions are very positive.

First off, the cooler is legit. It is very quiet even at load. With +75mV and 60% fan, at heavy GPU load I am only seeing temps in the 63-67C range. 65.7% ASIC, in case anyone is interested.

I am not sure, but it seems overclocking is getting held back some by the paltry 105% power limit. With a +100 core and +400 RAM OC I am bouncing off it a fair bit. Just getting started, but thus far, while looping the Metro LL benchmark at max details (2560x1600), I am artifact-free and stable at 1210MHz boost and 7800MHz memory.
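
If anyone wants to check whether the power limit is what's clipping the boost, here's a rough Python sketch that just polls nvidia-smi once a second while you loop a benchmark; if the core clock sags whenever the board nears its limit, that's your answer. Illustrative only: it assumes nvidia-smi is on your PATH, and some fields (power.draw in particular) aren't reported on every GeForce board, so you may see "[Not Supported]".

Code:
import subprocess
import time

FIELDS = "clocks.current.graphics,temperature.gpu,power.draw"

def sample() -> str:
    # One CSV line per GPU, e.g. "1150 MHz, 65, 230.50 W"
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=" + FIELDS, "--format=csv,noheader"],
        text=True,
    )
    return out.strip()

if __name__ == "__main__":
    for _ in range(60):  # one minute of samples
        print(sample())
        time.sleep(1)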
 
So my card has arrived and I finally have it working (had to update my mobo bios). I haven't installed anything except for Afterburner (only for monitoring at the moment), but GPU-Z shows the base clock to be 980 MHz. According to the [H] review, the out of the box clock should be 1020 MHz. Probably a stupid question, but am I missing something?
 
So my card has arrived and I finally have it working (had to update my mobo bios). I haven't installed anything except for Afterburner (only for monitoring at the moment), but GPU-Z shows the base clock to be 980 MHz. According to the [H] review, the out of the box clock should be 1020 MHz. Probably a stupid question, but am I missing something?

Yes, your card has 3 modes: Silent Mode, Gaming Mode, and Overclock Mode. Each mode has a different base clock and fan profile: 876MHz base / 928MHz boost for Silent, 980MHz base / 1046MHz boost for Gaming, and 1020MHz base / 1085MHz boost for Overclock Mode.

Download the MSI Gaming App and you can change the mode.
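
For the curious, here's a tiny illustrative Python sketch of what checking the mode amounts to: compare the base clock GPU-Z reports against the presets above. The numbers come from this post, not from MSI documentation.

Code:
# Map each MSI mode to its (base MHz, boost MHz) preset, per the post above.
MODES = {
    "Silent": (876, 928),
    "Gaming": (980, 1046),
    "OC": (1020, 1085),
}

def identify_mode(base_mhz: int) -> str:
    """Return the mode whose base clock matches what GPU-Z shows."""
    for name, (base, _boost) in MODES.items():
        if base == base_mhz:
            return name
    return "unknown (custom BIOS or manual OC?)"

print(identify_mode(980))   # -> Gaming
print(identify_mode(1020))  # -> OC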
 
So my card has arrived and I finally have it working (had to update my mobo bios). I haven't installed anything except for Afterburner (only for monitoring at the moment), but GPU-Z shows the base clock to be 980 MHz. According to the [H] review, the out of the box clock should be 1020 MHz. Probably a stupid question, but am I missing something?

The default clock for the card should be controlled by its bios. I find it quite odd that you plugged yours in and got a 980 base (aka gaming mode) and when I plugged mine in I got a 1020 base (aka overclock mode). As araxie said, just grab the app and select overclock mode and you should be good to go.
 
The default clock for the card should be controlled by its bios. I find it quite odd that you plugged yours in and got a 980 base (aka gaming mode) and when I plugged mine in I got a 1020 base (aka overclock mode). As araxie said, just grab the app and select overclock mode and you should be good to go.

I read about the app but was hoping to not have to install it. I'll be overclocking manually anyway so I'll just leave it. Thanks guys, at least I know there isn't an issue with my setup....
 
So my card has arrived and I finally have it working (had to update my mobo bios). I haven't installed anything except for Afterburner (only for monitoring at the moment), but GPU-Z shows the base clock to be 980 MHz. According to the [H] review, the out of the box clock should be 1020 MHz. Probably a stupid question, but am I missing something?

Kepler GPUs do not have a defined out-of-box clock. There is a minimum "guaranteed" base/boost, but every Kepler out of the box will be different (and generally higher than NVIDIA's guaranteed clocks). You can buy 5 GTX 780s back to back and they will all have different boost clocks; the Kepler chip itself determines its base and boost clocks based on silicon characteristics.

I noticed this during the 680 era, when basically every Kepler had a different boost; none of them were the same. The good news is that the boost is generally far higher than NVIDIA's default reference. If you buy a reference 780, for example, the stock boost is supposed to be around 900MHz, but every 780 you buy will have a considerably higher boost than that, and it will not be a "set" boost. It all depends on silicon quality, and the Kepler chip dynamically determines what the boost should be based on that. Hence every Kepler has a different base/boost clock.
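
To put rough numbers on that, here's a toy Python illustration of the bin math. Kepler steps clocks in 13MHz bins (the "2 bins (26MHz)" mentioned later in the thread); the headroom figure below is made up for the example, and this is not NVIDIA's actual boost algorithm.

Code:
BIN_MHZ = 13  # Kepler adjusts clocks in 13MHz steps ("bins")

def observed_boost(guaranteed_mhz: int, headroom_bins: int) -> int:
    """Clock after GPU Boost steps up by some number of bins."""
    return guaranteed_mhz + headroom_bins * BIN_MHZ

# A reference 780 rated ~900MHz boost that steps up 15 bins lands near
# the ~1100MHz out-of-box boost described above:
print(observed_boost(900, 15))  # -> 1095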
 
I find it quite odd that you plugged yours in and got a 980 base (aka gaming mode) and when I plugged mine in I got a 1020 base (aka overclock mode). As araxie said, just grab the app and select overclock mode and you should be good to go.

Mine clocked in @ 980 base with just the NVIDIA drivers installed also. On one of the other sites' forums I think there were posts about this. It appeared that cards sent to reviewers defaulted to the faster clock rates. Purchased cards also had different BIOS version numbers. The BIOS on my card seemed to be a later version than the ones reviewed. (I'm @ work now but can post my BIOS number later.) This might be why yours defaults to faster than 980.
 
I don't know if MSI did something way different with the MSI Gaming cards, but based on my experience (I've used 8 Kepler cards total since '12, and talked to many others who have them), it is absolutely normal for every Kepler GPU to have a different base and boost clock. It's a characteristic of Kepler itself: it will dynamically determine the base and boost clock based on silicon quality. NVIDIA publishes the minimum boost/base as a guarantee, while every actual product you buy will have a (usually) substantially higher boost above that.

So I'm just kind of confused by all the talk of 1080 vs 1106 boosts on the MSI Gaming cards; this is completely normal. With any Kepler GPU, if you buy 7 cards back to back, each one will likely have a different actual base and boost clock.

I'm just not sure whether MSI did something to enforce a specific boost. Can anyone shed light on that? It really sounds like normal behavior for the Kepler chip itself. When I got my first 680 SLI setup, I noticed that the boosts of the two cards were different, and that's when I started looking into it. Everyone I asked (over at OCN) noted the same thing: every card boosts differently depending on silicon quality.

Anyway, just curious whether MSI did something to enforce a specific boost with their mods. Maybe they did, but I really suspect it's just normal Kepler behavior.
 
Mine clocked in @ 980 base with just the NVIDIA drivers installed also. On one of the other sites' forums I think there were posts about this. It appeared that cards sent to reviewers defaulted to the faster clock rates. Purchased cards also had different BIOS version numbers. The BIOS on my card seemed to be a later version than the ones reviewed. (I'm @ work now but can post my BIOS number later.) This might be why yours defaults to faster than 980.

Something smells funny to me. I'll pull the BIOS revision from the card when I have the chance.

I don't know if MSI did something way different with the MSI Gaming cards, but based on my experience (I've used 8 Kepler cards total since '12, and talked to many others who have them), it is absolutely normal for every Kepler GPU to have a different base and boost clock. It's a characteristic of Kepler itself: it will dynamically determine the base and boost clock based on silicon quality. NVIDIA publishes the minimum boost/base as a guarantee, while every actual product you buy will have a (usually) substantially higher boost above that.

So I'm just kind of confused by all the talk of 1080 vs 1106 boosts on the MSI Gaming cards; this is completely normal. With any Kepler GPU, if you buy 7 cards back to back, each one will likely have a different actual base and boost clock.

I'm just not sure whether MSI did something to enforce a specific boost. Can anyone shed light on that? It really sounds like normal behavior for the Kepler chip itself. When I got my first 680 SLI setup, I noticed that the boosts of the two cards were different, and that's when I started looking into it. Everyone I asked (over at OCN) noted the same thing: every card boosts differently depending on silicon quality.

Anyway, just curious whether MSI did something to enforce a specific boost with their mods. Maybe they did, but I really suspect it's just normal Kepler behavior.

You're both right and not quite right at the same time. The "base" clock being discussed is set the same for every card off the line (i.e. for a particular reference model or for a particular third-party card). That is the difference being discussed: the card that I reviewed was set to a base clock of 1020MHz, while a few others here have a base clock of 980MHz by default. The "official" boost clock varies from model to model, but within a line of cards (much like the base clock) it will typically be the same; it is set by the firmware and is tied to the base clock. The clock that you are talking about is the "observed boost clock", which is the actual frequency that the GPU achieves when considering heat, power use, configured base clocks, and the direction and velocity of the wind. So, for the review sample card (going off of memory, feel free to double check the article for exact numbers), it had a base clock of 1020MHz, a boost clock somewhere higher than that, and an observed boost clock of 1150MHz.
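
Since the three terms are easy to mix up, here's a minimal Python sketch of how they relate. The limit logic is invented purely for illustration (real GPU Boost is far more involved); the 1020/1085/1150 figures come from the review and the mode table earlier in the thread.

Code:
BIN_MHZ = 13  # Kepler clock step

def observed_boost(boost_mhz, headroom_bins, temp_c, power_pct,
                   temp_limit=80, power_limit=105):
    """Toy model: firmware 'boost' plus silicon headroom, clamped by limits."""
    clock = boost_mhz + headroom_bins * BIN_MHZ
    # Shed one bin at a time while a limit is violated (temperature is
    # treated as fixed here for simplicity; dropping clocks cuts power).
    while clock > boost_mhz and (temp_c > temp_limit or power_pct > power_limit):
        clock -= BIN_MHZ
        power_pct -= 1
    return clock

# Review-sample numbers: 1020MHz base, 1085MHz official boost (OC Mode),
# ~1150MHz observed - i.e. five bins of headroom with neither limit hit.
print(observed_boost(1085, headroom_bins=5, temp_c=67, power_pct=98))  # -> 1150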
 
Hi guys, we are in contact with MSI to find out what is going on. I just got off the phone with Mark Tran at MSI, and they are going to look into why one card might install at 1020 (OC Mode) and another card might install at 980 (Gaming Mode).

The modes supported are clearly advertised on MSI's website, and every mode advertised is covered under warranty. The modes can easily be switched by installing MSI's Gaming App and, with a one-button press, changing from Gaming Mode to OC Mode. MSI suggests installing this app and running the card at the maximum OC Mode supported on the card for the best performance. Since we tested at the advertised and supported OC Mode of the video card, our results are representative of the performance you will also get.

Note that this will not be the only new card from MSI supporting the MSI Gaming App. MSI's upcoming 290X custom card, which we have on the way, will also use this new 3-tier performance setting, easily managed by a one-button press in the software. This is new technology from MSI that all of us are going to have to get used to and keep in mind when looking at card performance for future MSI "Gaming" branded video cards.
 
Afterburner supports this card and you're bound to get a higher OC using that. I don't see what the big deal w/ stock clocks is given how easy it is to use AB.

I only got 1210 on the core, but it beats stock and my memory clocks to 7800.
 
I don't see what the big deal w/ stock clocks is given how easy it is to use AB.

I don't think anyone thinks it's a really big deal, just a curiosity that some cards install with a different frequency than others. I think it was on TPU's forums that someone speculated (since the reviewer's board had a faster base clock rate) that NVIDIA intentionally furnished reviewers with cards clocked higher than the cards that went to Joe Public.

Brent said they're in contact with MSI to find out what's going on with the install frequencies. It's a little amusing to me that it's at least 1 day later & still no answer, though. You'd think that MSI engineering would know exactly how their card installs & what frequency it should default to. This really shouldn't be something they have to research.
 
BTW, I'm still very happy that I bought this card. Other than an initial burble with 3DMark11 it's worked flawlessly. Also nice & quiet.

I had an HD 7970 that I was pretty satisfied with. When the R9 290Xs came out, I got the upgrade bug & got in on the Amazon 'deal' where their partner (Tiger Direct) advertised a sale of about $150 off, then shipped plain 290s. Since the cost to me was $50 over retail, I returned the card. I had been waiting, assuming the mining craze would peter out & card prices would return to retail, or even under, at which point I planned to get a 290X. During this time I saw a lot of forum posts about black screens & people twiddling voltage levels to get their 'new' R9s to work correctly. There were posts about AMD claiming the black screens were a driver problem that AMD was fixing, & then posts about how it wasn't a driver problem but a power supply issue. Too many questions for me. When you pay that much $$$, the product should work perfectly, without flaws. I sold my 7970 for a nice premium thanks to the miners & was able to get this card for a reasonable upgrade cost when it went on sale for $669.
 
Hi guys, we are in contact with MSI to find out what is going on. I just got off the phone with Mark Tran at MSI, and they are going to look into why one card might install at 1020 (OC Mode) and another card might install at 980 (Gaming Mode).

The modes supported are clearly advertised on MSI's website, and every mode advertised is covered under warranty. The modes can easily be switched by installing MSI's Gaming App and, with a one-button press, changing from Gaming Mode to OC Mode. MSI suggests installing this app and running the card at the maximum OC Mode supported on the card for the best performance. Since we tested at the advertised and supported OC Mode of the video card, our results are representative of the performance you will also get.

Note that this will not be the only new card from MSI supporting the MSI Gaming App. MSI's upcoming 290X custom card, which we have on the way, will also use this new 3-tier performance setting, easily managed by a one-button press in the software. This is new technology from MSI that all of us are going to have to get used to and keep in mind when looking at card performance for future MSI "Gaming" branded video cards.

I first asked this over in the review thread at TechPowerUp. I got this card, and it really seems that only reviewers got the BIOS that defaults to 1020MHz. The rest of us normal people (consumers) got the BIOS that defaults to 980MHz. Not a huge deal, except that I thought something was wrong with my card, since all the reviews said the base clock was 1020. Odd move by MSI.

My BIOS is 80.80.30.00.2C 980/1046.
 
I first asked this over in the review thread at TechPowerUp. I got this card, and it really seems that only reviewers got the BIOS that defaults to 1020MHz. The rest of us normal people (consumers) got the BIOS that defaults to 980MHz. Not a huge deal, except that I thought something was wrong with my card, since all the reviews said the base clock was 1020. Odd move by MSI.

My BIOS is 80.80.30.00.2C 980/1046.

Odd move, or calculated? Hopefully this wasn't some slick marketing call to try to 'pad' the reviews somewhat. It's hard to imagine that a company wouldn't think different default clock speeds would be one of the things looked at carefully by both reviewers & consumers. I'm happy with this board, but really interested in what MSI's explanation is.
 
I first asked this over in the review thread at TechPowerUp. I got this card, and it really seems that only reviewers got the BIOS that defaults to 1020MHz. The rest of us normal people (consumers) got the BIOS that defaults to 980MHz. Not a huge deal, except that I thought something was wrong with my card, since all the reviews said the base clock was 1020. Odd move by MSI.

My BIOS is 80.80.30.00.2C 980/1046.

Did anyone go nuts about how review cards were clocked higher than consumer cards like everyone did over the 290X cards? The 'fraud' panic was epic.

http://hardforum.com/showthread.php?t=1800281
http://lucca.hardforum.com/rewrite/...les&id=1&match=1&source=none&destination=none
 
Did anyone go nuts about how review cards were clocked higher than consumer cards like everyone did over the 290X cards? The 'fraud' panic was epic.

http://hardforum.com/showthread.php?t=1800281

Like I indicated, it's hard to believe this was an intentional 'fraud'. The cards are designed for the 3 performance levels. With all the technical expertise at the various review sites & among end users, I think it's more likely that reviewers got earlier (BIOS) revision cards than what went on sale to the general public. Any changes by MSI's engineering department were most likely to enhance the reliability/stability/performance of the cards & not anything sinister. It's not like consumers didn't get cards capable of the same performance that the reviewers' cards achieved; the same clock rates are available thru the MSI Gaming App. I'd be upset if all the reviewers' cards had enhanced clock rates that weren't available in the cards sold to Joe Public, but that wasn't the case.

These cards are fast, quiet & stable. They look like they were really ready for 'prime time' when they went on sale. The 290s came with a slew of 'black screen' complaints, noise complaints, etc. No wonder people went nuts.
 
Did anyone go nuts about how review cards were clocked higher than consumer cards like everyone did over the 290X cards? The 'fraud' panic was epic.

http://hardforum.com/showthread.php?t=1800281
http://lucca.hardforum.com/rewrite/...les&id=1&match=1&source=none&destination=none

All due respect, that's a different situation. The throttling on the 290X cards was on the order of more than 100MHz. In quiet mode, various websites found the 290X cards throttling by an obscene 100-200MHz, whereas press samples throttled less.

Here, from what I can tell, the base clock is varying by what, 20-40MHz? Honestly, it doesn't matter, since generally speaking you get boost clocks in games. Just how I see it: in an actually demanding game, the base clock isn't the observed clock speed; the observed speed will be far higher, so the base clock doesn't really matter much.

I don't know what your experience with your Galaxy 680s has been, but every Kepler GPU I've used boosted well past the NVIDIA-guaranteed boost speed. For instance, my 780s were rated at 900MHz boost or thereabouts; one of them boosts to around 1100 out of the box, and that is a reference-PCB card. My 680s showed similar behavior; I used both reference and Lightning 680s, which boosted well past NVIDIA's guaranteed boost. I dunno. The 290X thing is just a way different situation in terms of throttling. My reference 680s throttled, but never by more than 2 bins (26MHz); the Lightning 680s don't throttle at all at load, and neither do my 780s. I highly doubt this MSI card has throttling issues, and that's what the press vs. retail 290X variance issue boiled down to. I just don't see a correlation between this situation (if there is a situation) and the 290X press variance issue, unless I'm really misunderstanding you.
 
These cards are fast, quiet & stable. They look like they were really ready for 'prime time' when they went on sale. The 290s came with a slew of 'black screen' complaints, noise complaints, etc. No wonder people went nuts.

I don't see how other issues have anything to do with card variances. It got so crazy that several sites felt the need to investigate it. I don't think it was fraud in either case, but AMD was POUNDED over it.
 
The situation is EXACTLY the same. Cards that reviewers had were faster than cards consumers had. PERIOD. With AMD it turned out to be a variance in fan speeds and was fixed in a new driver. Yes, it is possible that the in-game performance of the MSI card MIGHT be the same if the boost clock always boosts to the max value, but if that is so, why have different base clocks at all? What if the boost is less on the cards set to Gaming Mode vs OC Mode? Then the final performance would be less, just like with the 290X. AMD was unfairly pounded over the issue, and [H] even came to their defense.
 
OK, from the TPU comments the cards do BOOST less.
On stock (980/1046), the actual in-game speed is 1097MHz. In OC Mode (using the Gaming App), the actual in-game speed is 40MHz higher than that, which is 1137MHz.
Is it a big deal? Of course not. I just think it is a shame that people are reacting calmly here, which is the correct thing to do, yet cries of fraud spread like fire in the case of the 290X.
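
Back-of-envelope on those TPU numbers, just to size the gap (illustrative Python; the clocks are from the quote above):

Code:
gaming_mhz, oc_mhz = 1097, 1137  # actual in-game clocks per TPU comments

delta = oc_mhz - gaming_mhz
print(f"{delta} MHz = {delta / gaming_mhz:.1%}")           # -> 40 MHz = 3.6%
print(f"~{round(delta / 13)} Kepler boost bins of 13MHz")  # -> ~3 bins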
 
Odd move, or calculated? Hopefully this wasn't some slick marketing call to try to 'pad' the reviews somewhat. It's hard to imagine that a company wouldn't think different default clock speeds would be one of the things looked at carefully by both reviewers & consumers. I'm happy with this board, but really interested in what MSI's explanation is.

I'm interested in the explanation as well. There are a number of perfectly valid reasons for this to be happening (along with a number of perfectly invalid reasons). From a review perspective, I was rather pleased that it came configured out of the box for the fastest mode and I didn't have to install their app to bump it up.

Did anyone go nuts about how review cards were clocked higher than consumer cards like everyone did over the 290X cards? The 'fraud' panic was epic.

http://hardforum.com/showthread.php?t=1800281
http://lucca.hardforum.com/rewrite/...les&id=1&match=1&source=none&destination=none

I don't think that these two cases are the same, nor are they a fair comparison. To me, there's nothing worth getting butthurt over with the 290X, for the reasons that I investigated and outlined in the article that you linked above.


Like I indicated, it's hard to believe this was an intentional 'fraud'. The cards are designed for the 3 performance levels. With all the technical expertise at the various review sites & among end users, I think it's more likely that reviewers got earlier (BIOS) revision cards than what went on sale to the general public. Any changes by MSI's engineering department were most likely to enhance the reliability/stability/performance of the cards & not anything sinister. It's not like consumers didn't get cards capable of the same performance that the reviewers' cards achieved; the same clock rates are available thru the MSI Gaming App. I'd be upset if all the reviewers' cards had enhanced clock rates that weren't available in the cards sold to Joe Public, but that wasn't the case.

These cards are fast, quiet & stable. They look like they were really ready for 'prime time' when they went on sale. The 290s came with a slew of 'black screen' complaints, noise complaints, etc. No wonder people went nuts.

I suspect the same reliability and stability reasons (or perhaps reducing GPU failure rate due to the OC) are the root cause of the BIOS change. The fact that the card is warrantied and supported at OC Mode speeds makes this a non-issue with regards to the card I got being any better/different than the card you got, other than its stock settings. The thing that furrows my eyebrows is that I want to review the card as it is being sold to the retail channel, and not a variation thereof, as the aftermath simply causes confusion and delay.

OK, from the TPU comments the cards do BOOST less.

Is it a big deal? Of course not. I just think it is a shame that people are reacting calmly here, which is the correct thing to do, yet cries of fraud spread like fire in the case of the 290X.

I'm reacting calmly because I'm waiting on MSI's reply :).
 
I don't think that these two cases are the same, nor are they a fair comparison.
Why not? In both cases reviewers got different cards than consumers.

To me, there's nothing worth getting butthurt over with the 290X, for the reasons that I investigated and outlined in the article that you linked above.

This was the final comment on your article; it was added by Brent:
It is obvious that there are differences in performance and clock speed between Radeon R9 290X cards, however those performance differences are minor enough that even the most attentive gamer would not notice these while playing. Given that, the next stop for the two GIGABYTE cards is in our daily driver rig so we can enjoy gaming on an Eyefinity setup
It seems as if the issue was overblown.
 
Why not? In both cases reviewers got different cards than consumers.

Given how the AMD boost system works, no two cards are going to be the same, and due to how they chose to present it to the public with the "Up to" terminology, they opened themselves up to the public trout slapping that they have received with it. With the AMD samples, all reference 290X's ship with a clock speed of "Up to 1GHz", and they do indeed perform "Up to 1GHz". In the current case, the cards shipped to reviewers arrived clocked to a different speed than they have been arriving in the retail channel (even though both speeds are officially supported). If you wanted to make this a similar argument for the 290X, then you'd have to clock the retail channel ones to "Up to 960MHz" but still warranty them if you pushed that number up 40MHz.

This was the final comment on your article; it was added by Brent:

It seems as if the issue was overblown.

I wrote that comment. Brent's comments are found within the "Brent's Thoughts" section. At the end of the day, the 290X variances do not change the gameplay experience of the card, just as I would suspect that the Gaming Mode vs OC Mode clocks on the MSI card would not change the gameplay experience of the card (which I have not tested).
 
At the end of the day, the 290X variances do not change the gameplay experience of the card, just as I would suspect that the Gaming Mode vs OC Mode clocks on the MSI card would not change the gameplay experience of the card (which I have not tested).

Correct. So why did one situation cause widespread panic and the other just a murmur?
 
Correct. So why did one situation cause widespread panic and the other just a murmur?

I first caught wind of this two days ago (Friday) as I do not typically run around the internet reading all the reviews on everything (or even on the cards that I do review). We got an initial response (see above, in the thread from Brent) almost immediately and we'll continue looking into it until we feel that there is some sort of resolution to it (and we'll present that resolution to you/our readers through a channel that is appropriate). The 290X issue took a while to fester and it is also an issue that impacts every reference card produced, whereas this is limited in scope to a single manufacturer's card.
 
The 290X issue took a while to fester and it is also an issue that impacts every reference card produced, whereas this is limited in scope to a single manufacturer's card.

That makes sense. This just reminded me how unfairly AMD was treated back then.
 
That makes sense. This just reminded me how unfairly AMD was treated back then.

The issue was treated fairly by us. We initially reported on it when AMD came to us telling us there was a driver fix in the works for the variance; that is when we posted news about it and let readers know a driver fix was coming. Then, when the driver came, we decided to run hard data and testing on it, and out of that we wrote an article to conclude the issue.

At the time it was big internet news for the brand-new 290X, which was the most important GPU announcement around and the newest thing on the block. It was important to find out whether the issue was real and how it affected real-world gameplay.

Born out of it, we discovered it is appropriate to "warm up" AMD and NVIDIA GPUs alike before doing run-throughs, which is something we naturally did anyway through the nature of how we test, but we concluded this was an important real-world aspect to take into consideration when testing performance.

The issue was big on the internet because AMD's "reported" frequency of 1GHz was not consistently being achieved. Real-world frequency was under what was reported, and it varied between video cards. This was new information, and there was a lot of misunderstanding about how the R9 290X works. It was "advertised" as 1GHz, but 1GHz was not what you always got.

The issue with the MSI card here is about the card shipping at a different default base clock. However, the modes are advertised and displayed. In fact, nowhere on MSI's page does it say what the default out-of-box base clock is supposed to be, so it could be any of the 3 modes advertised. Since it is running at one of the 3 modes advertised, it cannot be considered false advertising.

Whereas with the AMD R9 290X, many would have considered it false advertising, since the packaging and specifications never stated "Up to 1GHz"; they just state 1GHz, and 1GHz is not always achieved.

If you cannot understand those two differences, then I don't know what to say.
 