Been using a 4K 40" display for some time with a custom widescreen res, 3840x1620. That's about 25% fewer pixels than full UHD, so the performance gain vs UHD is useful. Problem is, after extended use (about a year), there is slight burn-in at the edges of the black bars.
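For anyone wanting to check the math on that 25% figure, a quick sketch (resolutions are the ones from my setup above):

```python
# Compare pixel counts: full UHD vs the custom 3840x1620 resolution.
uhd = 3840 * 2160      # 8,294,400 pixels
custom = 3840 * 1620   # 6,220,800 pixels

savings = 1 - custom / uhd
print(f"{savings:.0%} fewer pixels than UHD")  # -> 25% fewer pixels than UHD
```

Since 1620 is exactly 75% of 2160 and the width is unchanged, the savings is exactly 25%.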
If AMD raster = Nvidia, power draw is roughly the same, price is within $50 of Nvidia, and RT is better with Nvidia, I think the majority of buyers will go Nvidia. Reviews will show RT performance in games and that will be a determining factor. People are swayed by benchmarks. +/- $50 won't make much difference...
More important to me is 4k raster performance. RT not important for me at this stage, though with next gen cards (Hopper, RDNA 3) things should look more interesting with more RT titles and more capable HW to run RT @ 4k.
Techpowerup because they test the most games and of course for their relative performance charts.
Guru3d since I've been a member there for a long time, and the reviewer (Hilbert) uses a FLIR (thermal imaging camera) for temps, which can detect weaknesses in GPU cooling designs.
Anandtech but they test...
RGB FULL. Even though that's 4:4:4, it will be 8-bit since you need HDMI 2.1 to get above that. Still better than the other options at this stage until HDMI 2.1 arrives.
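A rough bandwidth check shows why 10-bit RGB 4:4:4 at 4K60 needs HDMI 2.1. This is a sketch with two assumed figures: the 594 MHz pixel clock is the standard 4K60 timing (which includes blanking), and 14.4 Gbps is HDMI 2.0's effective data rate after 8b/10b encoding overhead:

```python
# Uncompressed video bandwidth for 4K60 RGB at 8-bit vs 10-bit per channel.
pixel_clock_hz = 594e6    # standard 4K60 timing, incl. blanking (assumed)
hdmi20_data_gbps = 14.4   # HDMI 2.0: 18 Gbps TMDS minus 8b/10b overhead (assumed)

for bits_per_channel in (8, 10):
    gbps = pixel_clock_hz * bits_per_channel * 3 / 1e9  # 3 channels (R, G, B)
    verdict = "fits" if gbps <= hdmi20_data_gbps else "exceeds"
    print(f"{bits_per_channel}-bit: {gbps:.2f} Gbps -> {verdict} HDMI 2.0")
# 8-bit comes to ~14.26 Gbps (just fits); 10-bit comes to ~17.82 Gbps (exceeds)
```

So 8-bit RGB 4:4:4 squeezes in under the HDMI 2.0 limit with almost nothing to spare, while 10-bit doesn't, which is why you have to drop to 4:2:2/4:2:0 or wait for HDMI 2.1.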
Two subcontractors responsible for the Founders Edition manufacturing are now under investigation: Foxconn and BYD (Build Your Dreams). It is also said that even NVIDIA product and sales managers have not seen the design yet, which also may or may not be true, but if it is then it just proves...
Some speculation that there will not be a 3080ti (initially at least), and that the 3080 will be the top card, based on a GA102 chip. Sounds logical to me since Nvidia has always done that with the Ti since Kepler (release it several months after an xx80). Only Turing bucked the trend.
Never had a card that bothered me noise-wise over the last decade. There may be a few hot and loud cards this may apply to, but I doubt the 2070 is one of them.
Best way to resolve your issues is to have a secondary display, preferably 1080p since that is a standard res compatible with AV receivers. Plug your GPU directly into your main monitor (with DP), then run HDMI from the GPU directly to the receiver, which is connected to your second display. The...
This was a unique case involving an inferior architecture, Kepler, which has not kept up with the times. Using it as a general, all-encompassing example for your argument is really misleading. Try comparing other Nvidia vs AMD cards, ie, 980ti vs FuryX or GTX 1080 vs Vega 64 (release date...
Nvidia brightness, contrast, and gamma settings do not affect the actual monitor hardware. Meaning if your monitor settings are at full brightness and it's drawing 80w of power, reducing brightness from the Nvidia controls means it will still draw 80w. Adjusting these settings directly from the monitor is...
yep, my bad. meant to say the 8800gt was 256bit (as opposed to 320-bit of the earlier cards), but it came with 512mb vram as well as the 256mb.
I've owned the 8800gts 640mb from when it was released and in most benches for the cards of that period, the 320mb was mostly equal with the 640mb (@...
320mb at the time (2007) was above average, certainly more than anything ATI had at the time. Plus it had a 320-bit bus. The 8800gt had even less vram!! 256mb! I guess you weren't old enough to remember cards from that era.
This was the most demanding game at the time. Look at the ATI cards in...
How convenient of you not to mention the 8800GTS 320 & 640mb versions, which were released at the same time as the GTX. Those were reasonably priced and offered almost twice the performance of the previous gen, and ATI had nothing at that performance level at the time. It wasn't the 8800GTX that...
The most satisfying GPU upgrade is when you go to one that is massively more powerful (at least 3 times as powerful in this case). So yeah, I can imagine how stoked you are with the new card.
Every card that I've bought was my favorite of all time.... until the next one that came out. Therefore my current card (RTX 2080) is my favorite of all time.
Not a Navi owner, long-term Nvidia user. As bad as things may seem in the tech media, I really don't see this as affecting more than a minority of people. Newegg and Amazon user reviews roughly average about 4 out of 5 stars in customer satisfaction. So I think most users are generally...
Techpowerup because it benches the most games (often over 20) in its GPU reviews. It also gives performance summaries per resolution, per watt, per dollar, etc. AT is good, as is Guru3d.
There is also the potential mistake of assuming poor performance or stutters are due to lower vram when other factors may be at play. Some mid-range or weaker cards can run out of GPU power before the lesser vram has an impact. A good way to be sure is to test identical cards with 4gb...
That chart of vram usage in games looks like it was done on an 8gb card. If a GPU has more vram, some games may allocate more (not need, just allocate). A point I thought necessary to make since many people don't know the difference between need and allocate.
Pretty silly. The test was commissioned by AMD and was a brief snapshot of 6 vs 6 GPUs within a 12 day period using one desktop driver and one workstation driver from each side. AMD certainly would have been observant of various driver behaviors within that limited time and felt confident that...
70 page thread at AMD forum re persistent Navi driver issues. Hope they get it sorted out :D.
https://community.amd.com/thread/243837?messageTarget=all&start=1725&mode=comments
CRTs are inferior for reading text, not as sharp as LCDs. They're also too small (max 24"), bulky and heavy, not to mention power hungry. And good luck finding one in good working order that hasn't deteriorated over time (discoloration, phosphor decay, etc).
Yes true... just kind of sad that an almost 3 year old card is still being compared to AMD's latest. Where Navi does very well (ie, Vulkan), there are other instances where it's still behind the 1080ti. Consistency seems to be the issue.
Poor Ampere.
Without the mining factor inflating prices, but purely on its merits as a gaming card. The 1080ti is unique in that it has not depreciated much, even well after the mining boom ended.
I wanted to buy one last year but waited until the RTX cards were out. Bought a 2080 because it was slightly...
Not sure if this thread was posted as intended. A 3 1/2 year old last-gen Nvidia card (gtx1080) not performing in RT as well as the latest AMD card, the 5700xt? Is that it??
In some games the 5700xt will have an edge, namely BF5 and RDR2, where the 1080ti lags behind and the 5700xt may indeed be smoother. Have not heard of any other reports of games where one card is "smoother" than the other. Tbh, it would be a tough decision to choose one over the other. Newer games...
I received an email from a banker in Nigeria wanting to give me 40% of $8,500,000 for posing as the beneficiary for a deceased account holder. Do you think it's fake?
Here's something you can try. Of course it depends on the inputs available on your monitor. TVs are more flexible in this regard as they have multiple HDMI inputs.
Your monitor is likely to have 1 HDMI and 1 DP input. Plug your GPU directly into the monitor with a DP cable. This will take care of...