Discussion in 'HardForum Tech News' started by AlphaAtlas, Jan 9, 2019.
He got everything down except the price, which has not been confirmed.
Actually, the price of the GPU was based on a Navi announcement, not on a Vega 2 w/16 GB of HBM memory.
So NVIDIA burned bridges with their pricing by bringing DLSS and ray tracing to the world, but AMD is the hero for recycling a GPU from 2016 at the same MSRP? Some people, man.
P.S. NVIDIA now supports adaptive sync, so the FreeSync argument is dead.
And praise God for FreeSync being dead now. Maybe G-Sync prices will finally fall.
Well I'm excited to see how DLSS runs, but I'm not interested in the real-time ray tracing until video cards get faster. I didn't enable 8X MSAA when it first came out because it was a resource hog; same goes for DXR. The best thing that NVIDIA has announced is the streaming upgrade with OBS. That would make me want to buy an NVIDIA card. If DLSS takes another year to show up in more than a handful of titles, then I know I made the right choice to skip those cards. If it makes a real difference in games that I play, and has widespread adoption by Fall, then I might be all in for an upgrade.
I'd rather have 16GB of memory over another resource hog that I never enable.
Just my opinion, without having seen a review of the new AMD card. Also, I hope that NVIDIA can get more than 12 FreeSync monitors to work in the future through a driver update. That would be nice!
Why 16GB, though? Wouldn't something like 8-12GB be cheaper and still compete well? AMD mentioned content creation and gaming; could this just be a gaming+creation SKU instead of creating another one (what the fuck is the GPU name again? Radeon 7?) with 8GB?
The thing you are ignoring is that there will be more and more games supporting DXR, DLSS, or a combination of both as we move forward, and these features will shine in games that aren't fast-paced FPS titles (I always thought BF5 was a bad choice to show off DXR). NVIDIA has already demonstrated that DLSS in combination with DXR largely alleviates the performance cost of DXR alone (in BF5), so I expect more and more titles to utilize that in the future (see the napkin math below for why rendering fewer pixels helps so much). Plus DLSS alone is a huge advancement for 4K gaming; you can't deny that, and it will undoubtedly gain traction since it costs developers nothing and all the training is done by NVIDIA. And as you mentioned, there are NVENC improvements for OBS, but there are also ShadowPlay game filters, which for me as a gamer is a HUGE plus because now I don't have to worry about ReShade (which doesn't work in some games now, like BO4). Now contrast that with AMD's $700 recycled card from 2016. What do they bring? Nothing.
Also, for those saying ray tracing won't get anywhere: it will, since it's supported by DirectX. It may not be huge anytime soon, but all new groundbreaking features take time. In the meantime, those that have the hardware for it can still enjoy it.
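To put rough numbers on the DXR+DLSS point above, here's a toy Python model, purely my own back-of-envelope and not NVIDIA's actual pipeline (the 25ms frame time is made up):

def pixels(width, height):
    return width * height

# If shading cost scales roughly with pixel count, rendering internally
# at 1440p and upscaling to 4K cuts per-frame shading work by more than half.
native_4k = pixels(3840, 2160)        # 8,294,400 pixels
internal_1440p = pixels(2560, 1440)   # 3,686,400 pixels

ratio = internal_1440p / native_4k
print(f"Internal render is {ratio:.0%} of the 4K pixel count")   # ~44%

hypothetical_4k_ms = 25.0  # made-up DXR frame time at native 4K
print(f"Rough 1440p cost: {hypothetical_4k_ms * ratio:.1f} ms")  # ~11 ms, leaving budget for the upscale step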
Got any questions for AMD on the VII? I am writing up some questions now.
Would it even be possible to have fewer games supporting DLSS than we do currently?
Also, am I the only one sitting here wondering why AMD decided to release this 7nm Vega gaming card after stating their 7nm Vega was intended for HPC only? I wonder what the implication behind this is. We noted the use of the HPC card as a pipecleaner for 7nm GPUs in 2020, and it's great AMD is making an attempt to compete with NVIDIA's RTX cards, but is it simply an RTX response, or is there something more behind it?
I wonder how TSMC is doing. Smartphone 7nm demand is probably slowing down, crypto hit a mine and sank, and Apple, well, we know how that's going (lower demand and lowered revenue). Perhaps their 7nm orders are freeing up for AMD.
Also, I wonder how the consoles are shaping up. We know Sony is targeting Navi; Xbox will probably do the same. I wonder if Navi is chiplets. That would be pretty cool to see.
Looking back at AdoredTV's video... consumer 7nm GPU... the 16GB HBM info was already known, and the price was estimated from the Red Gaming Tech piece.
So this Radeon 7 was the result of AMD worrying they had nothing to compete with NVIDIA, so they produced 7nm Vega. But after seeing the not-so-impressive Turing results, AMD was less worried and canceled 7nm Vega production for gaming. The chips they had already produced (Radeon 7) are stock they don't want to lose money on, so February 7th... here it comes.
Also, all this talk of RTX and ray tracing: big meh. Can we reach 4K 60 FPS first?
It is my opinion that Vega 7 is priced too high.
$599 makes more sense when you factor in hardware features (regardless of available software or practicality).
Cancelled? I doubt it. Also isn't Turing still beating AMD anyway? Why wouldn't AMD try and present a product to compete there?
16GB of HBM2 is fucking eye-watering amounts of $$$.
I wanted to present this summary from AnandTech that raises some questions.
I can answer that one for you.
Navi is coming. And... You can always lower the price, you can't jack it up. And yes, I agree with you. I was truly thinking $649 would have resonated better overall.
Having watched the presentation and read the comments here and elsewhere, I'm honestly more psyched about the PS5. Not that I'll be buying one, mind you.
Let me know when CES does the presentation about the telepods. I'll be sure to have my fly swatter handy...
Then flip the 3-game bundle. Could easily net you $75+ and bring it closer to your $599 price point.
Can't sell the game bundles, they are tied to the UPC.
A bit underwhelmed by all of this. Not really interested in a Vega II refresh or that $700 dollar price tag, I'm sure a lot of people are though. That RTX 2060 for $350 is looking mighty tempting now.
I see that the Vega VII is a 16GB model; will there be 8GB or 12GB versions for less? Possibly in the $499 range?
Those features have unproven support. AMD helped usher in Vulkan along with numerous other technologies that see a whole lot more use. Ray tracing is baked into DX12, and as it stands it's woefully underwhelming. DLSS is cool and all, but not a feature worth the asking price.
When those features become widely used, along with more robust FreeSync monitor support (12 out of how many FreeSync monitors?), a reasonable asking price, and last but not least stability, then NVIDIA might get my money again.
The Radeon VII is $699, halfway between the 2070 and 2080 Founders Edition prices ($599 and $799).
LOL! The VII has 16GB of HBM2 with 1TB/s of bandwidth, and it also doubles as a workstation-type card. As Kyle also said, you can always lower the price, but you cannot jack it up. We do not even have a clue what overclocking headroom, if any, there will be. I would not touch a 2080 with a ten-foot pole, since it can barely do 1080p with that RTX "feature" enabled, even with the free space invaders game thrown in.
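For anyone wondering where that 1TB/s figure comes from, it falls straight out of the memory config everyone is reporting. Assuming four HBM2 stacks at 2.0Gbps per pin, the quick Python napkin math:

# Peak HBM2 bandwidth for the reported Radeon VII memory config
bus_width_bits = 4 * 1024   # four HBM2 stacks, 1024-bit interface each
data_rate_gbps = 2.0        # per-pin data rate in Gbit/s

bandwidth_gbs = bus_width_bits * data_rate_gbps / 8  # bits -> bytes
print(f"{bandwidth_gbs:.0f} GB/s")  # 1024 GB/s, i.e. ~1 TB/s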
I'd rather skip the bundle and have the lower price tag.
True, but the long development timeline means that you probably aren't going to be playing those games with the GPU you buy today. You'll be playing them on your next GPU.
Hmm. Would that have to be an active adapter? If so, I wonder how much input lag it would add.
No idea, and I'm also not sure whether VRR works through a DisplayPort-to-HDMI adapter in the first place. Maybe it's not technically feasible. This is why we need faster adoption of new connector standards!
What are you asking exactly?
Do we know this is not full Vega 20?
I saw an interesting chart on AnandTech.
If that information is accurate, AMD finally doubled the ROPs from 64 to 128 (they've been stuck at 64 on high-end cards since the R9 290 series launched all the way back in 2013).
But look at the stream processors and CUs: they're actually LOWER than regular Vega 64, so this could easily be a much smaller Vega die than the original. Not sure, though. Dropping from 14nm to 7nm while only increasing the transistor count from 12.5 to 13.2 billion suggests to me they are working with a smaller die than Vega 64 was, possibly much smaller, which allows for more headroom in the future.
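If the chart's numbers hold up, here's some quick Python napkin math on that trade. The clocks are the announced "up to" boost figures, so treat these as ceilings, and the Radeon VII line is unconfirmed:

# Peak theoretical throughput: FP32 = shaders * 2 ops (FMA) * clock;
# pixel fill = ROPs * clock. Real sustained clocks will be lower.
def fp32_tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000

def fill_gpix(rops, clock_ghz):
    return rops * clock_ghz

for name, sp, rops, clk in [("Vega 64", 4096, 64, 1.546),
                            ("Radeon VII", 3840, 128, 1.80)]:
    print(f"{name}: {fp32_tflops(sp, clk):.1f} TFLOPS, "
          f"{fill_gpix(rops, clk):.0f} Gpix/s")
# Vega 64:    ~12.7 TFLOPS,  ~99 Gpix/s
# Radeon VII: ~13.8 TFLOPS, ~230 Gpix/s -- the ROP doubling is the big jump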
Hey Kyle, did they give you a card to review? Hope so, hope so, hope so........
The more I think about it, this demonstration is great, but I wonder if heat-soaking the system by loading all the cores might be hiding any turbo clock advantage Intel may have at lighter, less-than-all-core loads.
Would have been nice to see a single core Cinebench as well as the multi-core one.
One of the reasons Cinebench is not really a good benchmark for true performance: it simply finishes too quickly and does not heat-load the CPU. That is one of the reasons we use the longer encode workloads.
The fun thing is, I bet it's costing NVIDIA more for a 12nm wafer full of 2080/Ti chips than it's costing AMD for a wafer of 7nm Vega 20 chips, not to mention the yields on the smaller die have to be better.
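Nobody outside NVIDIA/AMD/TSMC knows the real wafer prices, but you can sketch the die economics in Python with the commonly reported die sizes (TU104 ≈ 545mm² for the 2080, Vega 20 ≈ 331mm²). Everything below except the die sizes is a placeholder, just to show how the math works:

import math

def dies_per_wafer(die_mm2, wafer_mm=300):
    # Standard approximation: wafer area over die area, minus an edge-loss term.
    r = wafer_mm / 2
    return int(math.pi * r**2 / die_mm2 - math.pi * wafer_mm / math.sqrt(2 * die_mm2))

def yield_fraction(die_mm2, defects_per_cm2):
    # Simple Poisson yield model: Y = exp(-D * A)
    return math.exp(-defects_per_cm2 * die_mm2 / 100)

# Defect densities and wafer costs below are GUESSES, not foundry numbers.
for name, die, defect, wafer_cost in [("TU104 (12nm)", 545, 0.2, 6000),
                                      ("Vega 20 (7nm)", 331, 0.4, 10000)]:
    n = dies_per_wafer(die)
    good = n * yield_fraction(die, defect)
    print(f"{name}: {n} candidates, ~{good:.0f} good, "
          f"~${wafer_cost / good:.0f} per good die")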
Kyle, don't forget to ask about HDMI 2.1 support. I assume this is going to be the card for 4K 144Hz FreeSync 2 HDR.
I would expect retail cards to run 1700MHz to 1800MHz while gaming. Certainly this is just a guess at this time. But let's remember that AMD does its clock statements a bit differently than NVIDIA. 1800MHz Boost Clock is still "up to" 1800MHz.
That is because running it over and over does not heatload the CPU or the cooling attached to it, especially a custom loop system. But let's get back to on topic conversation please.
All those are already on the list.
Whatever they "saved" per chip they spent on process node advantage not to mention HBM. AMD isn't raking it in on these, not by a long shot.
All our questions are now in to AMD and are being worked on as I type. Thanks for the feedback, guys, many thanks!!! I think we covered most if not all of what was proposed here, plus a lot of other topics. Of course, all of this has to be answered and then signed off on by the AMD hierarchy, so I am unsure of the timeline for publication. My flight back to Dallas leaves at 6am, so I will be back at the keyboard by late afternoon.
AMD's CEO Lisa Su confirms ray tracing GPU development
“I think ray tracing is an important technology, and it’s something we’re working on as well, both from a hardware and software standpoint,” Su said. “The most important thing, and that’s why we talk so much about the development community, is technology for technology’s sake is okay, but technology done together with partners who are fully engaged is really important.”
"The consumer doesn’t see a lot of benefit today because the other parts of the ecosystem are not ready" Su added
Smart move... they're content to let NVIDIA release these prototypes while biding their time until the market is ready for a true ray-tracing card.