Hey, have any of you guys ever done this? I'm using a -100mV offset to overclock my Intel 8700K to 4.8GHz. My understanding is that overclocking with this technique also lowers all the various Speedstep voltages by 100mV, which could affect CPU stability at idle or during light workloads when...
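The worry in that post can be sketched with some quick arithmetic: a fixed offset shifts every Speedstep P-state down, not just the overclocked turbo bin. The voltage points below are illustrative assumptions, not measured 8700K VIDs:

```python
# Sketch: a fixed -100 mV offset applied across hypothetical P-states.
# The stock VID numbers here are made up for illustration only.

OFFSET_V = -0.100  # -100 mV offset set in BIOS/XTU

# Hypothetical stock voltage per P-state (GHz -> volts)
stock_vid = {
    0.8: 0.70,   # idle
    2.0: 0.85,   # light load
    3.7: 1.05,   # base clock
    4.8: 1.25,   # all-core overclock target
}

# The offset shifts every state by the same amount
offset_vid = {freq: round(v + OFFSET_V, 3) for freq, v in stock_vid.items()}

for freq, v in sorted(offset_vid.items()):
    print(f"{freq} GHz: {v} V")
```

Note that the idle state drops by the same 100 mV, which is why an offset that's stable under a load test can still crash at the desktop: the low P-states end up with less margin than Intel binned them for.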
What about 10nm? I guess Intel doesn't want to talk about that?
Talk about a corporation getting caught flat-footed. Why haven't they developed an ARM product by now?
AMD is going to stomp Intel this year, and then they'll both get stomped by Apple and whoever else can develop a superior ARM...
Thanks OP. Look at what nVidia is dealing with on the 12nm node. We are approaching the limits of physics. [H]ard things are [h]ard. And advancing microprocessor manufacturing is damn [h]ard.
I clicked on the thread title hoping nVidia had come to their senses and renamed the RTX 2080 Ti a Titan and cut prices across the board. Just a free game. Too bad.
Well, what would be the point in Sony releasing the PS5 with anything less powerful than a GTX 1080? The Xbox One X is already about as powerful as a GTX 1070.
Until you can show me something made by AMD that can compete with a GTX 1080 without consuming close to 500 watts, I call full-on shenanigans on your post. You want to compare a 30W APU to a full-blown next-gen console GPU? LOL.
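For what it's worth, the "One X ≈ GTX 1070" claim roughly holds up on paper FP32 throughput, computed as shaders × clock × 2 ops/cycle (FMA). The counts and clocks below are the commonly quoted reference figures, and paper TFLOPS across different architectures is only a loose comparison:

```python
# Rough FP32 throughput comparison behind the "One X ~ GTX 1070" claim.
# TFLOPS = shader count * clock (GHz) * 2 (FMA = 2 FLOPs per cycle).

def tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * clock_ghz * 2 / 1000

gpus = {
    "Xbox One X": (2560, 1.172),  # 40 CUs * 64 stream processors
    "GTX 1070":   (1920, 1.683),  # reference boost clock
    "GTX 1080":   (2560, 1.733),  # reference boost clock
}

for name, (shaders, clock) in gpus.items():
    print(f"{name}: {tflops(shaders, clock):.2f} TFLOPS")
```

That puts the One X around 6.0 TFLOPS against roughly 6.5 for a reference 1070 and 8.9 for a 1080, so the gap Sony would need to clear for a 1080-class PS5 is real but not enormous.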
I have been advised by someone extremely in the know to keep all my assets as cash at this time. Tread lightly. I asked about blue chip stocks even. He said no.
What about the GDDR5X GTX 1060? Was that a myth? It even showed up on a board partner's website IIRC.
https://www.notebookcheck.net/Nvidia-s-new-GTX-1060-with-GDDR5X-memory-is-confirmed-to-use-the-same-die-as-the-1070-1070Ti-and-1080.361441.0.html
I'm just pleased that gamers are showing nVidia that we aren't stupid. nVidia tried to abuse their market position and the PC gaming community said "no way". Now nVidia will reap what they sowed. What are they going to do, cut the prices on the Turing cards now? That would alienate their most...
nVidia is telling people that only the RTX "low" setting is optimized right now. They are working on fixing the higher settings and recommend sticking with "low" for the time being.
From nVidia:
I really think nVidia caught AMD flat-footed here. And how could they not have, really. AMD has already been struggling to compete in terms of performance. There is no way they can develop their own RTX technology. They are going to try to reverse engineer what nVidia has done using whatever...