Recent content by SickBeast

  1. CPU overclocking using offset voltage

    Hey, have any of you guys ever done this? I'm using a -100 mV offset to overclock my Intel 8700K to 4.8 GHz. My understanding is that overclocking with this technique lowers all the various Speedstep voltages by 100 mV as well, which could affect CPU stability at idle or during light workloads when...
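
The concern in that post can be sketched in a few lines: a fixed negative offset shifts every point on the CPU's voltage/frequency curve by the same amount, so the thinnest stability margin often ends up at the low-voltage idle states rather than at the overclocked peak. The V/F points and the minimum-stable floor below are made-up illustrative numbers, not real 8700K values.

```python
# Illustrative sketch only: the voltages and the "minimum stable" floor
# are invented for the example, not measured 8700K behavior.

# Assumed Speedstep-style operating points: (frequency in GHz, stock voltage in V)
STOCK_VF_POINTS = [
    (0.8, 0.70),   # idle state
    (1.6, 0.80),
    (3.2, 1.00),
    (4.8, 1.25),   # overclocked peak
]

OFFSET_V = -0.100  # a -100 mV offset applied uniformly by the BIOS

def apply_offset(points, offset):
    """Return the V/F points with the offset added to every voltage."""
    return [(freq, round(volts + offset, 3)) for freq, volts in points]

def headroom(points, vmin_floor=0.65):
    """Margin above an assumed minimum stable voltage at each point."""
    return [(freq, round(volts - vmin_floor, 3)) for freq, volts in points]

offset_points = apply_offset(STOCK_VF_POINTS, OFFSET_V)
for (freq, volts), (_, margin) in zip(offset_points, headroom(offset_points)):
    print(f"{freq:.1f} GHz -> {volts:.3f} V (margin {margin:+.3f} V)")
```

With these made-up numbers, the 4.8 GHz point keeps plenty of margin after the offset, while the idle point dips below the assumed floor, which is exactly the light-load instability the post is asking about.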
  2. Intel Is "Very Pleased" with the Progress of Its 7nm Process

    What about 10nm? I guess Intel doesn't want to talk about that? Talk about a corporation getting caught flat-footed. Why haven't they developed an ARM product by now? AMD is going to stomp Intel this year, and then they'll both get stomped by Apple and whoever else can develop a superior ARM...
  3. AMD's 7nm Products Provoke Interesting Industry Responses

    Thanks OP. Look at what nVidia is dealing with on the 12nm node. We are approaching the limits of physics. [H]ard things are [h]ard. And advancing microprocessor manufacturing is damn [h]ard.
  4. NVIDIA Gives the Bird to All RTX Early Adopters

    I clicked on the thread title hoping nVidia had come to their senses, renamed the RTX 2080 Ti a Titan, and cut prices across the board. Just a free game. Too bad.
  5. PlayStation 5 Rumored to Sport Ryzen 8-Core CPU, Cost $500

    Well, what would be the point of Sony releasing the PS5 with anything less powerful than a GTX 1080? The Xbox One X is already about as powerful as a GTX 1070.
  6. PlayStation 5 Rumored to Sport Ryzen 8-Core CPU, Cost $500

    Until you can show me something made by AMD that can compete with a GTX 1080 without consuming close to 500 watts, I call full-on shenanigans on your post. You want to compare a 30 W APU to a full-blown next-gen console GPU? LOL.
  7. PlayStation 5 Rumored to Sport Ryzen 8-Core CPU, Cost $500

    What about the graphics? I don’t think AMD can make anything competitive with any type of reasonable power consumption for a console.
  8. Crypto Hangover Could Take Blame for NVIDIA's Potential GeForce RTX 2060 Delay

    If you read the link I posted it says they are using GTX 1070 chips for it. Maybe this will be a GTX 1060 Ti.
  9. RTX 2080 Ti GPUs Go Missing at AIBs

    I'm wondering if this situation is bad enough to require a re-spin, i.e. a new stepping.
  10. NVIDIA on the Cause of RTX 2080 Series Card Failures

    I have been advised by someone extremely in the know to keep all my assets as cash at this time. Tread lightly. I asked about blue chip stocks even. He said no.
  11. Crypto Hangover Could Take Blame for NVIDIA's Potential GeForce RTX 2060 Delay

    What about the GDDR5X GTX 1060? Was that a myth? It even showed up on a board partner's website IIRC. https://www.notebookcheck.net/Nvidia-s-new-GTX-1060-with-GDDR5X-memory-is-confirmed-to-use-the-same-die-as-the-1070-1070Ti-and-1080.361441.0.html
  12. Crypto Hangover Could Take Blame for NVIDIA's Potential GeForce RTX 2060 Delay

    I'm just pleased that gamers are showing nVidia that we aren't stupid. nVidia tried to abuse their market position and the PC gaming community said "no way". Now nVidia will reap what they sow. What are they going to do, cut the prices on the Turing cards now? That will alienate their most...
  13. Crypto Hangover Could Take Blame for NVIDIA's Potential GeForce RTX 2060 Delay

    I would love to see nVidia's answer to the new $280 RX 590 that includes three free games and mops the floor with even an overclocked GTX 1060.
  14. Real-Time Ray Tracing Is Enabled in Battlefield V

    nVidia is telling people that only RTX "low" is optimized right now. They are working on fixing the higher settings. They are recommending using RTX "low" at this time. From nVidia:
  15. AMD Comments on DirectX Raytracing Support

    I really think nVidia caught AMD flat-footed here. And how could they not have, really. AMD has already been struggling to compete in terms of performance. There is no way they can develop their own RTX technology. They are going to try to reverse engineer what nVidia has done using whatever...