I don't understand why the 4070 Ti is getting all the hate. If tech reviewers had any integrity and honesty they would have hated on the 4080 and the 7900 cards as well. They are all terrible for the prices being asked. These cards are all so badly priced that the 4090 looks like good value...
Truth? What truth? The GPU market is in the state it's in because of mining. The 4090 and 7900 XTX are in short supply because AMD and Nvidia are limiting production to sell off the huge amounts of stock left over from when the mining bubble burst. Nvidia even came out with a statement saying that...
That just means people aren't buying new cards, not that PC gaming is dying. The majority of PC gamers don't even have 1050-class GPUs.
There are large numbers of PC gamers still using integrated graphics. Any GPU released in the last 7-8 years is perfectly capable of playing every PC game...
I think you are the one who misunderstands a few things. They didn't add it to the DisplayPort spec because of blah, blah, blah. They added it because AMD submitted a specification change request in November 2013. It was a change to a timing parameter, amongst other things. This was their final...
This is a false equivalence. Of course they were crucified for doing it; they were caught cheating to make their cards get higher FPS.
The second problem with your statement is that DLSS and FSR give far better results than the hacks they used back then. And they are improving with each...
This isn't completely correct. AMD had been working on Adaptive-Sync before Nvidia did their G-Sync demo in October 2013. AMD submitted their final proposal to VESA to add it to the DisplayPort specification in November 2013. But they had the hardware needed for adaptive sync in their Hawaii...
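For context on what that timing change actually enables, here is a rough sketch of the adaptive sync idea as I understand it (my own simplification, not the VESA spec text): the display holds the current frame, by stretching the vertical blanking interval, until the GPU finishes the next one, clamped to the panel's supported refresh range. The 40-144 Hz window and the function name are just example values.

```python
# Rough illustration of variable refresh: the display follows the GPU's frame
# time instead of a fixed clock, within the panel's min/max refresh limits.

def next_refresh_interval_ms(frame_render_ms, min_hz=40, max_hz=144):
    """Pick the refresh interval for one frame under adaptive sync."""
    shortest = 1000.0 / max_hz   # panel can't refresh faster than this
    longest  = 1000.0 / min_hz   # ...or hold a frame longer than this
    # Follow the GPU's frame time as long as it fits the panel's range.
    return min(max(frame_render_ms, shortest), longest)

if __name__ == "__main__":
    for ft in (5.0, 12.5, 30.0):  # 200 fps, 80 fps, 33 fps render times
        print(f"frame took {ft} ms -> display refreshes after "
              f"{next_refresh_interval_ms(ft):.2f} ms")
```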
What are you talking about? Show me any technical fact that you have stated. Show me one actual technical truth. Show me one piece of science in your posts that backs up your idiotic comments.
You really should go learn something. I mean, it's all out there. Have you never heard of...
Every time you say that no amount of tweaking changed anything, it further emphasises that you didn't know what the hell you were doing. You see, what you say is actually impossible. You do realise that? If you really are an IT professional then you would understand how foolish your comments are...
Saying that you are an IT professional and that you couldn't get an Nvidia card and an AMD card to look the same on the same monitor is a contradiction in terms.
Back in the analog days this might have been true. Now in the digital age, GPUs and monitors have to meet certain requirements to get...
It still happened even after we went digital. Sometimes the AMD driver defaults to YCbCr 4:4:4 rather than RGB, though not as much recently. Sometimes it varies depending on the monitor.
I will agree with you, though, that nowadays, for the most part, there is no difference.
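For anyone wondering why a YCbCr vs RGB default would make two cards "look different" on the same monitor, here is a minimal sketch, assuming BT.709 coefficients and a limited-range output; the function name and test values are just illustrative:

```python
# Full-range 8-bit RGB in, limited-range YCbCr out (BT.709). If the GPU sends
# limited-range data and the monitor treats it as full range, the black and
# white points shift, which people read as "different colours".

def rgb_full_to_ycbcr_limited(r, g, b):
    """Convert full-range 8-bit RGB to limited-range (16-235 / 16-240) YCbCr."""
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    y  = 0.2126 * r + 0.7152 * g + 0.0722 * b      # BT.709 luma
    cb = (b - y) / 1.8556
    cr = (r - y) / 1.5748
    return (round(16 + 219 * y),                    # Y:  16..235
            round(128 + 224 * cb),                  # Cb: 16..240
            round(128 + 224 * cr))                  # Cr: 16..240

if __name__ == "__main__":
    for rgb in [(0, 0, 0), (128, 128, 128), (255, 255, 255)]:
        print(rgb, "->", rgb_full_to_ycbcr_limited(*rgb))
    # Black encodes to Y=16 and white to Y=235. A display expecting full-range
    # data shows those as dark grey and off-white, so the picture looks washed
    # out until the range or output format is corrected on one side.
```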
Funny you should mention colorimeters...
Yeah, you are right. Back in the day Nvidia had the digital vibrance (basically saturation) turned up a lot by default. They later changed this default to a much more neutral setting.
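To show what I mean by "basically saturation", here is a rough sketch of a vibrance-style control as a plain saturation multiplier; the 1.5x factor and the sample pixels are made up for illustration and this is not Nvidia's actual algorithm:

```python
# A vibrance-style control modelled as a saturation multiplier: colours get
# pushed further from grey, while neutral greys are left alone.
import colorsys

def boost_saturation(r, g, b, factor=1.5):
    """Scale the saturation of an 8-bit RGB pixel by `factor`."""
    h, l, s = colorsys.rgb_to_hls(r / 255.0, g / 255.0, b / 255.0)
    s = min(1.0, s * factor)
    r2, g2, b2 = colorsys.hls_to_rgb(h, l, s)
    return round(r2 * 255), round(g2 * 255), round(b2 * 255)

if __name__ == "__main__":
    print(boost_saturation(180, 120, 120))  # muted red -> noticeably more vivid
    print(boost_saturation(128, 128, 128))  # grey stays (128, 128, 128)
```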
No, I used to have customers like you who swore blind that AMD's colours were better, that the Nvidia card they switched over to looked worse. And they said the exact same thing as you: no amount of tweaking would make the display on the Nvidia card look the same. You would try to talk them...