Yeah, I'll never forget back in the day when Kyle, Anand, and I think Tom (correct me if I'm wrong) collaborated with their samples to discover the P3 1.1+ GHz chip faults, and got a top-tier product line pulled back from the biggest CPU maker in the world.
It kind of feels like Kyle is the only...
The biggest problem with this review is that a GTX 1070 will bottleneck a lot of those chips. We need to see something on a 1080 Ti, or even a Vega 64 LC, to push more frames.
There have been reported memory latency improvements which will be good.
Also, there are reported...
What all is the rig going to be used for?
1950X might be neat, but I'd shy away from it if gaming is your aim... My Ryzen 7 is usually underutilized in games right now; only a few actually spread the load (and when they do, it's beautiful).
If you're on 17.9.1 or 17.9.2, I'm hearing a lot of bitching on various forums about clock instabilities with those drivers (where 17.8.2 was stable at a clock that now crashes).
This x10.
I tried the AMD "clean install" option, which uninstalls the drivers and reinstalls after a reboot... I was still having the missing "Enhanced Sync" issue.
Only after a DDU in safe mode and a reinstall of that same driver level did Enhanced Sync come back.
I did have that happen on my RX 470, but not since I upgraded to my Vega 64 on the latest driver.
If you turn off Hardware Acceleration, it should stop. You can usually re-fullscreen, then it'll fix after a few seconds.
I believe it's a known issue they're working on.... maybe report it on the...
I mean... maybe.
But I seem to remember this being the first GTX 1080 "custom" card actually pictured... so I'd hold out judgment on whether or not we're getting decent custom boards.
Ryzen temps don't pull through RTSS / Afterburner right now, as far as I can tell.
But you can configure HWiNFO64 to push its temperature readings into RTSS.
Anyone else use HWiNFO64? I was getting stutters every 3-4 seconds on my machine while it was running; I even tried updating to the latest version and it still persisted.
I had it in the background launching at startup to feed RTSS for OSD info about my CPU temps, and it was driving me...
Got it all set up. I popped the fan off, reversed it, put the sticker on the backside and it's pulling through the P3's dust filter like my Kraken.
Thanks for the stock money AMD, I put it to good use :)
And we will suffer for it.
If Intel hadn't fucked AMD all those years ago, things would be a lot different I think.
AMD is at least putting forth more effort now to get tools out there; it'll be a game of catch-up though.
And depending on the needs, the AMD compute cards can still make sense...
They stepped into the monitor market with a proprietary hardware integration piece, so don't play coy.
NV holds the hands of developers with money; gotta spend money to make money, right? But the black-box approach is anti-competitive behavior when it excludes your competitor from optimizing to...
Nvidia did some shit like, oh, disabling PhysX if it detected an AMD card in the system, even if you also had an Nvidia card to run PhysX with. Which, by the way, is bullshit anyway. When they acquired the PhysX property, they retroactively added the feature to chips a year or two old, so it's not...
x86-64 was the best way to move the market forward towards 64-bit OSes and applications.
It let people exist on their current platform and eventually switch up to 64-bit, saving enormous costs for everyone.
Period.
Integrating the memory controller into the CPU was a significant performance...
Statistics mate, how does it work?
If only we had someone's account, recorded in digital format, of the pricing situation from the time... oh, look at that, another person used the written word? (The "Why Wait" section of the article, if you're impatient.)
Golly, imagine that.
It's the average. Buttons posted a contemporaneous article talking about the significant price spikes on the GTX 1070 and 1080.
If you still can't see it, then you're just admitting to being a troll at this point.
We're not even two weeks into just the Vega 64, let alone the Vega 56! Yeah it'll...
Look at the graph.
LOOK AT THE LEGEND.
Looking at a per-product listing is going to be horribly inaccurate: the SuperBiiz listing at "$650" is still on the graph up to the present (LOOK AT THE GRAPH), but it's "Out of Stock", which means you can't buy it.
Per vendor pricing is not...
Yeah, I kept the power draw in mind when I wired my new case setup for the Vega I'm awaiting. Definitely went with two 8-pin cables; felt more comfortable with more copper running to the card rather than less.
But if a far more efficient, lower-tier card comes from Nvidia, wouldn't they want to shift older cards to the used market and move to a newer CUDA Volta chip?
Just curious.
Doesn't this forebode GTX 1070s being the next bang/bust GPU mining victim? What happens when the Volta XX60 card comes out at about GTX 1070 performance, and at lower power consumption? Won't miners dump into the market, crashing Nvidia's used-card pricing just the same as what happened to AMD...
Some undervolting results for RX Vega 56 and 64:
https://www.hardwareluxx.de/index.php/artikel/hardware/grafikkarten/44084-amd-radeon-rx-vega-56-und-vega-64-im-undervolting-test.html
You'll have to use Google Translate :P
If history remembers correctly, the Bitcoin GPU craze kind of lasted from 2010 to 2013. We just started this craze... 6 months ago, tops?
Yeah, there will be at least one full product cycle released under these demands.
This is a fair point, one can hope they get production costs down to something manageable at that point.
if not, then they'll have to limit production, let prices artificially inflate, and hope for a fast Navi and/or good Vega 11 (if such a thing exists) with more mainstream characteristics...
That's a poor argument. How long did the last mining spree last, back when the 290X was getting pushed to $900 over MSRP?
How many currencies are there now? How much more developed is that economy?
It'll be even worse.
If "low cost" HBM becomes part of APUs in the future, say as a 1-2GB HBC with just a single 512-bit HBM memory interface, things could get interesting.