I wonder if the X-based models are going to be more heavily binned than last time around. I'm only basing this on the fact that a few non-X buyers are reporting 4ghz max overclocks while the 2700X owners seem to be able to hit 4.3 with enough (probably unsafe) voltage.
So much debate over early half-assed benchmarks... wait a few weeks and get the real thorough reviews and save yourselves the headache of arguing over this stuff.
Fury X to me felt like a bit of a proof of concept / trial run for HBM. As an R9 290 owner, I remember getting the upgrade itch at the time and feeling a bit let down by Fury and Fury X. For one, I wanted something with more VRAM and the performance upgrade didn't seem worthwhile enough to...
It's interesting that instead of making a statement to the community to clear up any misconceptions about the GPP program, NVIDIA is apparently contacting websites and trying to get them to dismiss or put a negative spin on the story. If it was all just a big misunderstanding and there...
Yeah, that's what I said in the post ;) AMD is clearly trying to put pressure on NVIDIA by bringing the story to the media, but if what Kyle is saying is true there could be legal action down the road as well.
Because there are a lot of people who keep coming back into the debate, 17 pages in, making the same arguments that entirely miss the point of Kyle's article, causing the discussion to go in circles as we have to explain the basics of the entire issue to them.
And they can decide to stay away...
I've never seen so many people salivating at the potential for a market leader to become an outright monopoly. The number of comments from people trying to spin this into another "good business move" from NVIDIA is pretty telling. I mean, the AIBs have been able to market their products as they...
I don't think this is necessarily true. Sure, NVIDIA might want you to differentiate its products, but Asus wants you to buy an Asus, regardless of whether it's a "GeForce" or a "Radeon". They want to establish the ROG brand so that you'll buy ROG products regardless of what chip...
The program details are pretty vague right now; obviously we're all guessing at how it could be implemented. If it was limited to something along the lines of what you're suggesting, I could see it not being as big of an issue. But if it was as simple as that, I don't think the AIB partners that...
I don't think there's any concern that they wouldn't remain an AIB partner; the concern is that they're essentially being pushed into either joining or becoming a second-class AIB partner while all the AIB partners in the program get preferential treatment.
To control the marketing, the brand recognition, to further push AMD into the "other", "budget" category instead of the "default" gaming GPU slot that NVIDIA wants gamers to associate with them. That's a problem. "Oh hey Asus, you want to launch your GTX 2080 ROG Strix at the same time as EVGA's flagship? Ah shit...
With NVIDIA's current market share being what it is, no one is going to do that. That's exactly the problem. They're using their current market share advantage to try to get an even larger stranglehold on the market. Is Asus going to give NVIDIA the middle finger and lose 70% of their GPU...
Let's be real here though: those benefits could leave the vendors that don't participate at a serious disadvantage. It might come to a point where they really don't have the option of not participating if they want to remain competitive. This program really isn't great if we want the GPU market...
People already associate GeForce with NVIDIA, so when someone buys an Asus ROG GeForce whatever, they know what they're getting. Asus has spent time and money developing the ROG brand, and if you look at their product lineup the vast majority falls under that ROG Strix category. This basically...
The only thing I think people should consider is, while they definitely could be fake, I wouldn't put it past an AMD PR guy to whip this thing together by taking an existing slide and just modifying it to match the new/current lineup. Laziness is a thing among professionals in any industry :p
If the images are fake then that's disappointing - mainly because I was pleasantly surprised to even see a 4.3ghz boost on the higher-end model. My original guess was that we might see 4.2 or something along those lines in a best-case scenario.
I'm really curious how heavily binned the processors will be across the lineup. Will users be able to overclock the Ryzen 5 2600 or 2600X to the same 4.3/4.4 levels that the 2700X offers? It seems like they're really trying to segment the processors by cooler as well, which is sort of an artificial...
These slides look pretty legit. Real AMD slides have had mistakes/typos before, and nothing about these screams fake: no crazy performance claims, no crazy "boost up to 4.6GHZ!!!~" claims either. People have been wondering what the justification for the 400 series chipsets would be, and the slides...
I'm thinking 4.1-4.2ghz overclocks are probably what we'll see out of these chips, with the golden sample/unicorn chips doing 4.3ghz with crazy voltage. A 200mhz increase isn't bad, but it's also not an incredible jump. It will help close the gap a bit though.
I will be pleasantly surprised if we get 4.4ghz stable overclocks out of these new Ryzen chips. If we do, I think even that will be a pretty big step forward. I think more realistically we'll see higher stock clocks and some very slightly increased o/c headroom (4.1-4.2ghz on more chips than you see...
Interesting. I'll have to try it again and see if the offset is kicking in consistently or not, it was when I last tested but I'll try again soon and see what's up.
The AVX offset was kicking in under both PUBG and Heroes of the Storm, which I've been playing a lot of lately. I'm admittedly more GPU-bound anyway since I'm using a 1080 (non-Ti), but I did find it annoying since neither of these games was going to turn my system into a fireball like Prime95 would.
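To put numbers on what that offset actually costs, here's a toy sketch (the 100 MHz BCLK and the 50x multiplier / offset of 2 are just illustrative values, not anyone's actual settings):

```python
# Toy illustration of an AVX offset: the effective multiplier drops by the
# offset whenever AVX code runs. 100 MHz BCLK and the 50x/offset-2 numbers
# are example values only.
BCLK_MHZ = 100

def effective_clock_mhz(multiplier: int, avx_offset: int, avx_load: bool) -> int:
    """Core clock in MHz for a given workload type."""
    return (multiplier - avx_offset if avx_load else multiplier) * BCLK_MHZ

print(effective_clock_mhz(50, 2, avx_load=False))  # 5000 - the "5ghz" overclock
print(effective_clock_mhz(50, 2, avx_load=True))   # 4800 - what AVX-using games get
```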
There's nothing wrong with using AVX offsets, but I overclock to get better performance in situations that actually demand it, meaning CPU-intensive applications or gaming (which probably use AVX instructions anyway), not to feel good about loading up Firefox or Notepad faster at 5ghz. If...
It's more that I just don't think it's entirely accurate to call it a 5ghz overclock if in the most demanding situations it's actually running at 4.8ghz (and, as I said, a lot of games that aren't even that demanding still trigger AVX instructions and thus the throttling). I'm more than happy that my CPU...
I don't really find those overclock results all that impressive when you consider the AVX offset. It feels like people don't want to admit that they can't hit 5ghz, so they settle for 5ghz in anything not remotely demanding and 4.8ghz in everything else. I can get into Windows just fine...
Well, here he did point out the weird results between reviews, and a lot of that came down to Multi-Core Enhancement. Auto-overclocking all cores to single-core boost levels and publishing those results as default clocks is obviously pretty misleading, but a lot of the reviewers themselves didn't...
After 6 years, a 20% increase in IPC and 2 extra cores doesn't seem that mind-blowing. I think it's great that Intel has finally shaken up its product lineup by moving the i3 to 4 cores and the i5/i7 to 6, but it really feels like this should have happened a few product releases back already.
I have an Asus Z370 Strix-F Gaming board on the way with 16GB of G.Skill Trident DDR4-3200 CL16 as well, but haven't really been able to find a CPU for purchase to go along with them ;) Still debating between the i5 8600K and the i7 8700K. The performance of the i5 is actually really impressive...
The one situation where I can see these new CPUs like the 8600K making a meaningful difference is emulation (PS3/Wii U - Cemu), as these emulators aren't really multithreaded and much of the performance is CPU-bound rather than bottlenecked by the GPU. Everything else will see some gains through...
Yeah, sometime next year... when Intel actually has some CPUs available for purchase. I was interested in an i5 8600K, but let's be real here, this is a paper launch. Suppliers in Canada are being told late October or early November for the first substantial shipment of CPUs, and then who knows...
Yeah, it just seems odd that the differences mainly come down to clock speed. If you clocked an R9 290 up to 290X speeds, there would still be a 5% fps difference; nothing huge, but it was there. There was a similar gap between the Fury and Fury X. But here we literally see zero difference. Really makes you...
After reading the HardOCP comparison between Vega and Fury X at the same clocks, I thought this one was interesting too. The conclusion seems to be that there is literally no difference in performance between Vega 56 and 64 when they are at the same clocks, regardless of the fact that the 56 is a...
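To put rough numbers on the shader gap (these counts are from memory, so treat them as approximate rather than official specs):

```python
# Shader counts from memory - approximate, not official specs.
shaders = {
    "R9 290": 2560, "R9 290X": 2816,   # the 290X has ~10% more shaders
    "Vega 56": 3584, "Vega 64": 4096,  # the 64 has ~14% more shaders
}
print(f"290X over 290:   +{shaders['R9 290X'] / shaders['R9 290'] - 1:.0%}")
print(f"Vega 64 over 56: +{shaders['Vega 64'] / shaders['Vega 56'] - 1:.0%}")
```

If ~10% more shaders bought the 290X a measurable gap at equal clocks, the 64's ~14% extra should show up somewhere; a flat zero suggests those extra CUs are being starved by something else.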
I can't help but feel that AMD really shot themselves in the foot by designing their products around HBM and then HBM2. Product shortages and availability issues with HBM1 caused supply problems for Fury/Fury X. And I was just thinking about how even a die shrink of a previous high-end product (like Fury X)...
But I don't think the issue was ever whether or not devs would be aware of the issue. It was whether or not a dev was going to go back and invest time and resources to optimize performance for existing titles rather than upcoming ones. There have been Ryzen performance patches for Dota 2...
Remember all the posts saying something along the lines of "Devs aren't going to go back and patch old titles to improve Ryzen performance" ...
http://forums.eu.square-enix.com/showthread.php?t=254796&s=ea21fd6c2d46877c189c6e647153b29d
My PSU is only semi-modular, so I was using the PCIe 6+2 pin cables already coming out of the unit to power the card. I just tried using 2 separate PCIe cables plugged into the additional PCIe outputs on the modular portion of the PSU, but the system still powered off during the tests :(
So, I got the issue to happen again, but this time without any overclock, just an increased power limit. I set the limit to 130% (with the alternate BIOS switch on the EVGA FTW) and ran OCCT with error checking. The system turned off after 2 minutes. Then I lowered the power limit to 120%, but still...
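For what it's worth, here's the rough sustained-draw math (the 215W stock board power is my assumption for an EVGA 1080 FTW, and real transient spikes can briefly go well past these averages):

```python
# Rough sustained GPU board power at raised power limits.
# 215 W stock board power is an assumed figure for an EVGA GTX 1080 FTW;
# transient spikes can exceed these averages and trip a marginal PSU's OCP.
ASSUMED_STOCK_POWER_W = 215

for limit_pct in (100, 120, 130):
    sustained = ASSUMED_STOCK_POWER_W * limit_pct / 100
    print(f"{limit_pct}% limit -> ~{sustained:.0f} W sustained (spikes higher)")
```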
Corsair offers express RMA, so I wouldn't have any downtime. If I set the voltage without raising the clock, it's not necessarily going to draw the full 1.2V unless it needs to, will it? I'm just wondering if that will stress the PSU in the right way. But I get what you're saying, maybe this is an...