https://www.cnbc.com/amp/2024/02/14/nvidia-passes-alphabet-market-cap-now-third-most-valuable-us-firm.html
Guys… big green is closing in on Microsoft and Apple. This is nuts!
Nvidia’s stock is up over 250% from just 1 year ago. Intel and AMD have been left for dead.
AI makes Nvidia a lot of money. It’s an emerging market, and for the moment, Nvidia is leading the way. I don’t think they’ve abandoned gamers or dislike them, but it is all but guaranteed that Nvidia is throwing most of their capital at AI research and development.
I’ve started experimenting with this monitor and different input sources. I recently tried an Apple TV 4K (Gen 3) on it, and holy hell it looks good. Ironically, my old Samsung Q90R also looks better with the Apple TV 4K.
This tells me something: it’s not the monitor that’s the issue… it’s the...
Obviously Apple does not sell their silicon, but Apple commands a decent market share in the PC space, and their chips are quite amazing for what they can do in a very small power envelope. Intel needs to be really careful. ARM chip development is no joke.
Depending on the game, DLSS @ Quality can lead to a slight increase in visual quality along with a small performance bump. It really depends on which version of DLSS is being used and how it’s implemented in a game, though. In general, if you like the framerate with DLSS off, then leave it off.
Ampere (RTX 3000) was a really good architecture held back by the Samsung 8N process. The reason we saw such an amazing uplift going from the 3000 to 4000 GPUs was mainly due to the process improvement. It allowed Nvidia to push their products much further without incurring the power penalty...
Cheap GPUs sell very well. My first GPU when I was a kid was a Geforce 2 MX400. It was $70 brand new. Is there a brand new GPU currently on the market that can get even remotely close to that price?
Really surprised the Core 2 Duo E6600 and/or Core 2 Quad Q6600 aren’t on this list. Destroyed Netburst, and overclocked like stink. They were epic CPUs.
Not really sad. It’s still new tech. Back when things like ambient occlusion, tessellation, complex shadowing, programmable shaders, AA/AF, etc. were implemented, it took years for them to become mainstream… and even longer to become easy to render. We take those items for granted now, but they...
I have a very large case, and the 4090 is already pushing it heavily. Unless Nvidia is willing to create an external box that connects to the PCIe slot through the back of my PC, I think we're at the limit of how large we can make a GPU. I sort of equate this to Smartphone size; smartphones kept...
I'm sure it's been said in the past 19 pages of this thread, but this has not aged well.
Also, I can go ahead and say that the 7800x3D WILL NOT be an utter failure of a CPU. Know why? Because I have its predecessor, the 5800x3D, and it is a godly CPU for gaming... and the 7800x3D is better in...
People are actually saying that Ryzen CPUs are bad? What? The Ryzen 1000 CPUs were AMD’s return to the market.
Ryzen 3000: AMD caught Intel
Ryzen 5000: AMD beat Intel (briefly)
Ryzen 7000: AMD is literally at parity while using significantly less power.
Ryzen is one of the BEST CPU lines. It’s...
Core 2 Duo E6600. Intel took the disastrous Prescott Pentium 4 and made the absolutely epic Conroe architecture, which reduced power consumption while at least doubling IPC. That chip ran so cool that I could run my Thermalright Ultra 120 without a fan directly attached to it. Zero chance you...
Either Prescott Pentium 4s or Bulldozer. The Prescott Pentium 4 was a crazy power hog and unable to match AMD Athlon 64 and X2 in most areas.
Bulldozer was the same thing for AMD. It could not match Intel's Core i5/i7, and in some cases, Bulldozer was worse than the previous Phenom II chips...
This is beyond terrifying if you think about it. One moment, we have Siri and Google Assistant which are able to give us search results, call people, save calendar appointments, and do other somewhat useful but innocuous things to assist us. Now, we have AI that is able to communicate with us in...
This is a pretty big issue IMO, as the LG 27” OLED monitor supports a 4K120 HDR input natively. ASUS has no excuse except that they cut corners to save a buck.
It is an absolute travesty that the monitor only has HDMI 2.0b ports. The LG version of this monitor has two HDMI 2.1 ports, so all 3 ports (2 HDMI, 1 DP) can do 1440p/240Hz. Also, the LG 27" monitor supports a 4K120 HDR input natively (downscaled to 1440p), which means you can play consoles...
This morning, I was ready to take it back because I didn't like the fact that it was not 4K and thought the panel was too expensive. Someone on another forum pointed out that the panel has MLA, which I hadn't realized and which is really cool. So I set it back up and started playing with the settings...
Ok, just got mine a little over an hour ago and have been playing with it since then, so these are my initial impressions.
Build Quality: Thin, metal, everything feels expensive. Build quality is exceptional. Good job LG.
Brightness: It's low compared to other displays, but does it bother...
Soooooo I have recently moved, and my current living situation means that the LG CX 55", which I have been using for nearly 3 years with over 15k hours of on-time, no longer makes sense. Got a new "toy" coming tomorrow.