I wouldn't be surprised if Asus pulled an Intel and got lazy.
However, their TUF brand has apparently gotten some notice with the 3080, and if the TUF line is improving, I can't see why that wouldn't carry over to their Strix and higher-end stuff.
Not a motherboard, but still relevant. A good...
I raise power use limits to their max, but otherwise, I change nothing else.
I sometimes turn on whatever motherboard enhancements are available, but only the simple stuff.
Hey, I can handle complexity, but this is Ryzen. It's already pushed to the wall by AMD out of the box.
If it was bots, then why did Microcenter have 12 cards?
12 cards isn't a real launch. That sort of number is insane.
This must be 1.7% yields all over again.
Nvidia is in some serious trouble if they can't produce working chips.
Yeah, AMD has a huge chance to rip right through into the top of the GPU market.
Nvidia looks like a bunch of greedy idiots now.
This sort of display is shameful, so now it's time to give AMD a chance to show off their graphics card.
🔥🔥🔥 !!!GO AMD!!! 🔥🔥🔥 DON'T LET US DOWN!! 🔥🔥🔥
I just got back from Microcenter!
I went down the line whispering a rumor of "Wow, in the official email, they said they have over 200 in stock. There's one for everyone!"
They had 12. >=3
(and no, I didn't get one.)
(Also, did anyone hear that rumor? I was dressed "Cyber Style," so you'd...
They're obviously playing a real nasty game here.
Corruption is at hand if this is how it works.
This calls into question Nvidia's ethical stance, as well as that of all the major retailers.
If they're playing this sort of game, then that's something to add to the reviews for this sort of device.
When it comes to applying thermal paste, I've seen from "transparent glass pressed against CPU" tests that a single pea is best, and with a longer processor, I imagine a bean is best.
Yes, a bean shape. Vegetables are healthy, right?
As for heatsinks, it depends what you're doing. When it...
What the HECK??
You've gotta be kiddin' me.
I mean, to be honest, I find that sort of setup incredibly SEXY.
But only hypothetically. To actually go that far seems a bit absurd, at least on my budget.
I'm stuck simply WISHING I had a UPS, but I just can't afford that sort of luxury.
Yeah, with the new Nvidia cards, the upcoming new AMD socket, and a likely new Intel socket, there's good reason to cut some bloated legacy out of the picture and start fresh, re-optimizing and enhancing the entire PC experience.
I've been saying it for years but yeah: Nvidia has some seriously good technology.
A pity they aren't very FOSS friendly, for what that's worth. Probably not 40 billion.
But it's still a consideration.
You don't necessarily need a "good" power supply.
Even cheaper power supplies, like the best $50 ones from Apevia, don't generally cause these sorts of issues. Don't go too cheap, of course, but the best from Apevia probably competes with the worst from Seasonic, eh?
Alright, let me try.
MAJOR RUMOR ANNOUNCEMENT
There is a possibility that an UNBELIEVABLE industry leak has occurred.
And because of this potential information, DISRUPTION is at hand.
While talking to what may have been a deep industry source, I was told this:
But I am forced to write it as a...
Yeah, as soon as I read your problem, I figured it was a power issue.
Switch to a new power supply as soon as possible.
If that doesn't fix it, then check SATA cables or your motherboard. But it's probably power.
I bought a Razer aluminium mousepad for in-bed gaming on my 4K TV, and it's actually great.
I bet it'd be a breeze to glue it to some custom cut wood and then mount the wood on your armrest.
It's a nice mat, but I actually don't know if they sell them anymore. Any sort of aluminium mousepad...
Anyways, I actually love these girly GPUs. They're so ephemeral, but deliberately so.
It stands in contrast to those serious designs that obviously still go out of fashion in a few years anyways.
At least now we know that it's meant to be a very specific and temporary...
All it'd take is a cheap x86 co-processor add-in card and now all your legacy programs would run like normal.
The main system CPU would be ARM and the GPU would be Nvidia with a proprietary PCIe style connector, but offering incredibly optimized performance.
I'm using that same Ballistix Max RAM in one of my systems, but I don't know much about how to deal with your issue.
I bought Ballistix Max for memory stability reasons: RAM rated for those ultra-high speeds, but run at 3600 CL18.
I keep my FCLK tied with my RAM clock, at...
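For reference, the 1:1 tie works out like this (a quick sketch; the 3600 / CL18 numbers are from the post above, and the formulas are just the standard DDR arithmetic):

```python
# DDR4 is double data rate: the "3600" rating is megatransfers per
# second, so the actual memory clock (MCLK) is half that.
ddr_rate = 3600          # MT/s, the kit's running speed
cl = 18                  # CAS latency in clock cycles

mclk = ddr_rate / 2      # real memory clock in MHz
fclk = mclk              # 1:1 Infinity Fabric ratio on Ryzen

# First-word latency in nanoseconds: cycles * nanoseconds per cycle.
# One cycle at 1800 MHz lasts 1000/1800 ns, i.e. 2000/ddr_rate.
latency_ns = cl * 2000 / ddr_rate

print(mclk)        # 1800.0 MHz
print(fclk)        # 1800.0 MHz
print(latency_ns)  # 10.0 ns
```

So 3600 CL18 lands at 10 ns with FCLK at a comfortable 1800 MHz, which is why it's such a common stability-first target on Ryzen.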
You're basically right. But then what is Nvidia's true aim?
They used to make motherboard chipsets, right? And they recently acquired Mellanox? Maybe they want to do something datacenter related.
It's tough to tell, but I hope Nvidia will let us know by releasing a sweet product. Did you hear...
It could be a complex game of 4-dimensional chess: Nvidia wants to crush AMD, AMD uses TSMC, TSMC also builds Apple's chips, and Apple is planning to move to ARM.
So this Nvidia-AMD-TSMC-Apple quad-game of hyper-linear chess...
Thermal paste can sometimes harden into a rock or crystal, and provide great heat transfer.
But if that crystal structure should crack, heat transfer drops massively.
I prefer a gooey paste instead of a hardening paste. Extremely fine silver particles, suspended in some sort of oil or solvent...
I got the humble 5600 Ultra because I heard the ones past 5600 have a single bug in a particular retro game. Splinter Cell or something.
When I built my retro rig, I simply decided that the 5600 was the right one for me. But honestly, I wouldn't say no to a cheap 5900 Ultra either. ;)
As much as I find Apple to be pretentious, I do respect them for making their own CPUs finally.
It was pathetic to see these brainwashed Apple fanatics using the exact same hardware as I use but for multiple times the price.
But Nvidia ought to keep making cards for us. I don't really want to...
If I were Nvidia, I'd be incredibly giddy over the idea of making my own console that pairs my graphics cards with my own highly specialized CPU: a console that's fully in-house and extremely competitive, with ultimate power. No more having to put Nvidia...
AMD is probably going to wait until after the RTX 3000 series, RDNA2, and Zen 3 before announcing or even mentioning it.
It's likely an issue with mindshare, where they simply don't want us consumers to be bogged down with too much tech news all at once.
There's just nothing to talk about yet...
Yeah, that sort of thing goes beyond typical obsolescence and reaches into malicious behavior.
I'm currently talking more from the perspective of someone who doesn't quite view IBM as guilty yet.
But it's the sort of thing where if the issue is that it's not mere obsolescence, and it's...
The dark fact is that not all older people will be able to adapt to new paradigms in computing and data usage. Merely being an engineer or corporate office worker isn't enough if the work you do simply doesn't bring profit to the company.
On the other hand, there is also an element of loyalty...