Everyone has a different standard for what qualifies as value. The question is always how sour or bitter you want to be when something doesn't meet your standard.
For each person that thinks the 2080 Ti is a ripoff, someone else is certain it's great value.
Personally, as a user I think it's bad value...
While I think the pricing is crazy, what's crazier to me is how similar RTX is to RX (you know, from that other brand?).
For a brand-conscious company like Nvidia, I'm not sure if this is a FUBAR.
Not sure what to think. So Steam has been overcounting because of Asian cybercafes.
As a result, we see that more people actually use Intel GPUs? I'm sorry, but I'm having a WTF moment about the NA/EU audience.
My personal opinion is that visually we are reaching a plateau at 1080p. Most development studios have reached a point of "good enough" for graphics, and as far as hitting 60fps goes, I think any midrange card will easily manage that today.
The future of more expensive graphics cards is either 4K...
Still wish we could turn back time. Mantle, DX12, Vulkan, all these "close to the metal" APIs should never have existed. It's like everyone forgot why Direct3D and OpenGL existed in the first place.
Algrim's got it right. When dealing with color accuracy, your eye gives you only your perception of that color. If you want to be sure your display setup is accurate, you get real tools.
Purple or gold dress?
No, any monitor can be calibrated. All you need is a colorimeter and calibration software. Spyder, look it up.
If you don’t calibrate, then as I said, all you have stated is a preference for AMD’s rendition.
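Since "calibrated" keeps getting thrown around loosely: what the calibration software is ultimately computing is a colour difference between what a test patch should be and what the probe reads off the panel. Here's a minimal sketch using the CIE76 Delta E formula; the sample values are made up for illustration:

```cpp
// CIE76 colour difference (Delta E) between a target colour and the measured
// colour, both in CIELAB space. A Delta E below ~1.0 is generally considered
// imperceptible. Sample values are assumptions for the example.
#include <cmath>
#include <cstdio>

struct Lab { double L, a, b; };  // CIELAB coordinates

double deltaE76(const Lab& target, const Lab& measured)
{
    const double dL = target.L - measured.L;
    const double da = target.a - measured.a;
    const double db = target.b - measured.b;
    return std::sqrt(dL * dL + da * da + db * db);
}

int main()
{
    Lab target   {50.0, 20.0, -10.0};  // what the patch should be
    Lab measured {50.8, 21.5,  -9.1};  // what the probe reports
    std::printf("Delta E: %.2f\n", deltaE76(target, measured));
    return 0;
}
```

Until you have numbers like that for both cards on the same calibrated display, "AMD's colours look better" is a preference, not a measurement.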
I'm sorry, but you have just shown how you really don't understand anything about color.
All you have shown is a preference for AMD's rendition and a dislike of Nvidia's.
Anyone who understands color accuracy and knows why monitor calibration is always required will then understand why color...
While the selfish side of me agrees with the sentiment that graphics cards are getting stupidly expensive, has anyone looked at DDR4 and mobile SoC prices? Global demand for phones is driving up production of all mobile parts, and fab capacity is mostly bought up by those making mobile parts...
This thread is hysterical. Can an architect never design another building again? Can a fashion designer never design another outfit again? Can a games developer never make another game again? So can a CPU designer never make another CPU again?
Jim Keller's a talent and everyone recognizes it. A...
I would definitely like to echo this opinion.
I always thought PC gaming was all about no-holds-barred performance. Pinnacle of performance where possible. Compromise is the land of consoles, not PC.
Yet the review seems content to settle for 60fps. While I get that buying say a...
I completely understand what you said. What I'm saying is that I don't think it's well substantiated.
With OpenGL, Nvidia's drivers were the gold standard for the longest time. Try using ATI drivers with OpenGL if you want to know grief. And I'm saying this in the nicest possible...
I asked which open standards Nvidia doesn't support. Arguably FreeSync, but that's not something every other GPU maker supports either. I'll check with Matrox.
So unless there's a list of open standards which Nvidia doesn't support, I don't see evidence that Nvidia won't support open standards unless...
*sigh* I'm totally at risk of sounding like a fanboy, given what I have posted today in a couple of threads.
I have to say you are being very unfair to Nvidia here. Can you list all the open standards that they don't support?
With CUDA, they developed the API because there were no...
Doubt all you like but it's not like AMD did not try. They just executed so poorly that they achieved none of their monopolistic goals.
First, what was Mantle? A set of APIs developed by AMD to allow users to code to the metal, in particular AMD's metal. Can Mantle code run on anything other...
I'm very certain the problem isn't that Vega is uncompetitive. The problem is that Nvidia delivered that performance a year earlier, and everyone bought it a year before Vega was released.
You... haven't dealt with a games publisher yet.
May I also add that anyone who has dealt with devrel at both AMD and Nvidia will pretty much know both are equally evil. Trying to paint one company as angelic over the other is just hypocritical.
I thought I should chime in about HDR and what my own workplace actually experienced.
A few months ago, the developers and artists took a serious look into HDR and decided to create a test scene using the Unity engine. They took a few scenes from one of our shipping games and redid them...
I'll look at JPR's figures. They will definitely be interesting.
As for Vega causing market share to plunge, I'm still very skeptical about it. I still think Steam's hardware survey results are not definitive, due to China's numbers skewing the results. I am certain what we are seeing is Nvidia's...
I don't think AMD's market share plunged due to Vega.
If you look at historical Steam Hardware Survey results, you will notice this trend.
August 2017
  English             38.56%  (-1.97%)
  Simplified Chinese  21.63%  (+4.99%)
  Nvidia              67.62%
  AMD                 18.63%

September 2017
  English             34.64%  (-3.92%)
  Simplified Chinese...
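To illustrate why I think the mix matters more than anyone actually switching cards, here's a toy calculation. The per-region shares below are invented for the example, not real Steam data:

```cpp
// Illustrative sketch: how a shifting survey population moves the aggregate
// numbers even if nobody switches GPUs. All figures here are assumptions.
#include <cstdio>

int main()
{
    // Assumed Nvidia share within each audience (hypothetical).
    const double nvidiaWest = 0.60;  // NA/EU respondents
    const double nvidiaCafe = 0.95;  // Chinese cybercafe respondents

    // Aggregate share under two survey mixes.
    double before = 0.90 * nvidiaWest + 0.10 * nvidiaCafe;  // 10% cafe sample
    double after  = 0.70 * nvidiaWest + 0.30 * nvidiaCafe;  // 30% cafe sample

    std::printf("Aggregate Nvidia share: %.1f%% -> %.1f%%\n",
                before * 100.0, after * 100.0);  // 63.5% -> 70.5%
    return 0;
}
```

Run the same reasoning in reverse for AMD and its aggregate share can "plunge" purely because the cafe-heavy sample grew.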
If you have been following the Hardocp news front page, this piece of news came up.
https://www.hardocp.com/news/2018/02/02/windows_10_finally_surpasses_7_market_share
http://gs.statcounter.com/windows-version-market-share/desktop/worldwide/#monthly-201701-201801
And in Steam's hardware...
I do agree that regional breakdowns are important. Our marketing guys got burnt by that already. For example, console penetration numbers look awesome for Europe and North America but insignificant in Asia. The Windows XP/7/8/10 split is another factor. Lastly, localization is also very important.
In my...
I had a quick peek at my studio's marketing data and it looks like India won't topple China anytime soon. They may be similar population-wise but a far greater proportion of China is online than India is.
India rivals China in terms of mobile usage though.
And I have to cheekily agree with...
Nailed it.
DX12 is totally hit and miss, and a lot more miss, because most developers simply aren't proficient enough with DX12 to optimize code like the experts in the industry (e.g. id, Epic). We struggle to even render correctly, and truth be told, graphics is starting to no longer...
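For anyone wondering what "not proficient enough" looks like in practice: below is a minimal sketch (my own naming and structure, not from any shipping engine) of the explicit CPU/GPU fence synchronization DX12 makes you write yourself. Under DX11 the driver did all of this behind your back, and getting it wrong in DX12 means corruption or stalls:

```cpp
// Explicit CPU/GPU synchronization in D3D12 via a fence. In D3D11 the
// runtime/driver handled this implicitly. Globals are for brevity only.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

ComPtr<ID3D12Fence> g_fence;
UINT64 g_fenceValue = 0;
HANDLE g_fenceEvent = CreateEvent(nullptr, FALSE, FALSE, nullptr);

// Block the CPU until the GPU has finished all work submitted to the queue.
void WaitForGpu(ID3D12Device* device, ID3D12CommandQueue* queue)
{
    if (!g_fence)
        device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&g_fence));

    const UINT64 signalValue = ++g_fenceValue;
    queue->Signal(g_fence.Get(), signalValue);       // GPU writes this when done
    if (g_fence->GetCompletedValue() < signalValue)  // GPU not there yet?
    {
        g_fence->SetEventOnCompletion(signalValue, g_fenceEvent);
        WaitForSingleObject(g_fenceEvent, INFINITE); // stall CPU until it is
    }
}
```

And that's just one fence. Multiply by resource barriers, residency and per-queue submission, and you can see why only the id/Epic-tier teams come out ahead.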
It didn't matter a single bit when real engineers actually posted here. God knows how many times my DX12 views have been dismissed, despite me working with it every day.
It's just a sign of modern times. Real news or facts don't matter if they don't fit your viewpoint.
This guy gets it.
If anything, graphical fidelity is no longer the priority it was a few years ago. We have reached a point where it's good enough. Call us lazy or complacent, but it genuinely is at the point of diminishing returns on effort.
While I agree fanboyism is embarrassing, I do strongly believe that people should have some degree of passion for technology. Passion for technology is what drives engineers, scientists and innovators to create new stuff and make things work.
Everyone should be championing advancement...
There is a cost to developing higher-end graphics features, and most developers aren't willing to bite at that cost. Games development is expensive enough that we struggle to even make the game. When Nvidia is willing to come in and basically do stuff for us at their cost (i.e. Nvidia developer...
I didn't miss that point. All I am pointing out is that inflation exists. It's financially unrealistic to expect the top end to remain the same price as the top end of 20 years ago; that's like expecting my Big Mac to still cost what it did 20 years ago while not wanting my salary to stay the same as 20...
Firstly, do you think that the top end's price will never increase? Inflation doesn't exist? My $1.99 Big Mac is still around?
Secondly, if you are so concerned with a certain price point, why not compare all the releases at that price point and determine if there's any performance gain to be...
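As a back-of-envelope illustration of the inflation point (the $299 launch price and 2.2% average rate are assumptions for the example, not sourced figures):

```cpp
// Compound a hypothetical 1998 top-end price forward at an assumed average
// annual inflation rate. Both inputs are assumptions for illustration.
#include <cmath>
#include <cstdio>

int main()
{
    const double price1998    = 299.0;  // hypothetical 1998 top-end price
    const double avgInflation = 0.022;  // assumed average annual inflation
    const int    years        = 20;

    const double price2018 = price1998 * std::pow(1.0 + avgInflation, years);
    std::printf("$%.0f in 1998 is roughly $%.0f in 2018 dollars\n",
                price1998, price2018);  // ~$462
    return 0;
}
```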
Sometimes, it almost feels like AMD has given up and they are just having a party to see what shit sticks.
Like #BetterRed. What kind of a stupid hashtag is that? Drop one "r" as a typo and they have been #Bettered.
Yet do you understand the cost/benefit for developers of jumping into DX12 and dropping DX11?
DX12 = Win10 only. DX11 = Win7/8/10. Which beancounter do you know will do what you wish?
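To make the beancounter's point concrete: a common pattern (sketched below, not taken from any particular engine) is to probe for d3d12.dll at runtime and quietly fall back to the DX11 renderer on Win7/8, which means you are building and maintaining both paths anyway:

```cpp
// Runtime check for D3D12 availability. d3d12.dll ships with Windows 10,
// so failing to load it means we should take the DX11 path instead.
#include <windows.h>

bool IsD3D12Available()
{
    HMODULE d3d12 = LoadLibraryA("d3d12.dll");
    if (!d3d12)
        return false;  // pre-Win10: use the DX11 renderer

    const bool hasEntryPoint =
        GetProcAddress(d3d12, "D3D12CreateDevice") != nullptr;
    FreeLibrary(d3d12);
    return hasEntryPoint;
}
```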
So many people I work with already predict DX12 is the new DX10.
But as to why DX12 was pushed out, we believe it's less about Win10 adoption and more about pushing UWP adoption. Unified development for PC and Xbox, which I think Microsoft wants developers to assume is the easy path...