Watch AMD's CES Keynote Here

I think by going with that "chiplet"-style design, separating the CPU from the I/O die, AMD could open up a whole extra market for themselves...why the fk would they not?

That is, be able to deball the I/O die or the core chiplet from the interposer (whatever you want to call that base pad).
Then you could keep the "socket" up to date while also being able to update the cores if needed. This could make AM4, and the sockets to come, truly the first evolving sockets; chiplets are probably one of the best ways to make that happen. Hell, why not make the "chip" a modern take on the cartridge, where the board comes with the base socket and I/O attached and you "slot in" the chip you want to complete the socket, slap on the provided cooler, etc.?

Either way, at the very least it ensures even tighter control over security, enhancements, firmware, etc. by decoupling them from each other. I shudder to think what memory pricing will look like when DDR5 comes out. DDR4 was "supposedly" much cheaper than DDR3 to produce, but most of the memory makers have slowed production to keep pricing up. DDR5 could allow 4x the yield at higher speeds, but that won't matter much if the makers charge 2-3x current pricing. I'm also not overly fond of the higher CL on DDR5 (from what I have seen).
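
One thing worth remembering on the CL point: CAS latency is measured in clock cycles, so a higher CL at a higher transfer rate isn't automatically slower in absolute terms. Quick math (the DDR5 speed/CL figures here are placeholder guesses on my part, since final specs aren't out):

```python
# First-word latency in nanoseconds: CL is in clock cycles, and the memory
# clock is half the transfer rate, so latency_ns = 2000 * CL / (MT/s).
def latency_ns(cl: int, mt_per_s: int) -> float:
    return 2000 * cl / mt_per_s

print(f"DDR4-3200 CL16: {latency_ns(16, 3200):.1f} ns")  # 10.0 ns
print(f"DDR5-4800 CL40: {latency_ns(40, 4800):.1f} ns")  # 16.7 ns (hypothetical spec)
```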

That is neither here nor there, however. PCIe 4.0 might be quite interesting: not that GPUs need the extra speed, but that's not to say something won't come along that can truly leverage it, like NVMe drives. Here's hoping the new boards pay even closer attention to port placement, connectivity, smart placement of fan headers and so forth. And for the love of god, let's make sure they have enough features, such as an external BCLK generator and a solid enough VRM. Nothing worse than a board cooking itself because they cheaped out on the heatsinks.
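
Back-of-the-envelope on why NVMe drives are the more likely beneficiary than GPUs (a rough sketch using the published PCIe transfer rates and 128b/130b encoding; it ignores protocol overhead):

```python
# Rough PCIe bandwidth math: per-lane throughput after encoding overhead.
# PCIe 3.0: 8 GT/s with 128b/130b encoding; PCIe 4.0: 16 GT/s, same encoding.
def pcie_gbs(transfer_rate_gt: float, lanes: int, encoding: float = 128 / 130) -> float:
    """Usable bandwidth in GB/s for a link (ignoring protocol overhead)."""
    return transfer_rate_gt * encoding * lanes / 8  # 8 bits per byte

print(f"PCIe 3.0 x4 : {pcie_gbs(8, 4):5.2f} GB/s")    # ~3.94 GB/s - current NVMe ceiling
print(f"PCIe 4.0 x4 : {pcie_gbs(16, 4):5.2f} GB/s")   # ~7.88 GB/s - headroom for new drives
print(f"PCIe 3.0 x16: {pcie_gbs(8, 16):5.2f} GB/s")   # ~15.75 GB/s - GPUs rarely saturate this
print(f"PCIe 4.0 x16: {pcie_gbs(16, 16):5.2f} GB/s")  # ~31.51 GB/s
```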
 
I'd take a Vega VII over a 2080 any day now. Considering I've used nothing but Nvidia for the longest time up until recently, they burned their bridges with their high pricing and failure rates. Plus there's FreeSync support, the potential to squeeze more performance from 7nm, and more VRAM.

I think even with a $700 price tag it might make Nvidia a little uncomfortable, considering the only step up they have from that costs roughly 50% more for less than a 50% gain in performance. Also, if they come out with a cheaper variant that competes with the 2070 at a few hundred less, I think it'll bring competition back to the market.
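
To put rough numbers on that "50% more for less than 50% gain" point (the step-up price and the performance gain below are illustrative assumptions for the arithmetic, not benchmark data):

```python
# Illustrative price/performance math for "50% more money, <50% more speed".
# These figures are assumptions for the sake of the example, not measurements.
base_price, step_up_price = 699, 1049  # hypothetical: card vs. its next step up
perf_gain = 0.35                       # assumed relative performance gain (<50%)

extra_cost = step_up_price / base_price - 1            # ~0.50 -> 50% more money
value_ratio = (1 + perf_gain) / (step_up_price / base_price)

print(f"Extra cost: {extra_cost:.0%}, extra perf: {perf_gain:.0%}")
print(f"Perf-per-dollar vs. base card: {value_ratio:.2f}x")  # <1.0 = worse value
```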
 
Where there's a Radeon 7, will there also be a Radeon 5 and a Radeon 3? And maybe a Radeon 9?

No, they killed the R5/R7/etc. branding after they tried it with the Radeon 300 series; it didn't end up working the way they wanted it to.

But I'm really hoping they re-think the Radeon 7 shit and go with Radeon Vega II.
 
I'd take a Vega VII over a 2080 any day now. Considering I've used nothing but Nvidia for the longest time up until recently, they burned their bridges with their high pricing and failure rates. Plus there's FreeSync support, the potential to squeeze more performance from 7nm, and more VRAM.

I think even with a $700 price tag it might make Nvidia a little uncomfortable, considering the only step up they have from that costs roughly 50% more for less than a 50% gain in performance. Also, if they come out with a cheaper variant that competes with the 2070 at a few hundred less, I think it'll bring competition back to the market.

So NVIDIA burned bridges with their pricing by bringing DLSS and ray tracing to the world, but AMD is the hero for recycling a GPU from 2016 at the same MSRP? Some people, man.

P.S. NVIDIA supports Adaptive-Sync now, so the FreeSync argument is dead.
 
So NVIDIA burned bridges with their pricing by bringing DLSS and ray tracing to the world, but AMD is the hero for recycling a GPU from 2016 at the same MSRP? Some people, man.

P.S. NVIDIA supports Adaptive-Sync now, so the FreeSync argument is dead.

And praise god for FreeSync being dead now. Maybe G-Sync prices will finally fall.
 
So NVIDIA burned bridges with their pricing by bringing DLSS and ray tracing to the world, but AMD is the hero for recycling a GPU from 2016 at the same MSRP? Some people, man.

P.S. NVIDIA supports Adaptive-Sync now, so the FreeSync argument is dead.
Well I'm excited to see how DLSS runs, but I'm not interested in the real-time ray tracing until video cards get faster. I didn't enable 8X MSAA when it first came out because it was a resource hog; same goes for DXR. The best thing that NVIDIA has announced is the streaming upgrade with OBS. That would make me want to buy an NVIDIA card. If DLSS takes another year to show up in more than a handful of titles, then I know I made the right choice to skip those cards. If it makes a real difference in games that I play, and has widespread adoption by Fall, then I might be all in for an upgrade.

I'd rather have 16GB of memory over another resource hog that I never enable.

Just my opinion without having seen a review on the new AMD. Also I hope that NVIDIA can get more than 12 FreeSync monitors to work in the future through a driver update. That would be nice!
 
Why 16GB, though? Wouldn't something like 8-12GB be cheaper and still compete well? AMD mentioned content creation and gaming; could this just be a gaming+creation SKU instead of them creating another (what the fuck is the GPU name again? Radeon 7?) one with 8GB?
 
Well I'm excited to see how DLSS runs, but I'm not interested in the real-time ray tracing until video cards get faster. I didn't enable 8X MSAA when it first came out because it was a resource hog; same goes for DXR. The best thing that NVIDIA has announced is the streaming upgrade with OBS. That would make me want to buy an NVIDIA card. If DLSS takes another year to show up in more than a handful of titles, then I know I made the right choice to skip those cards. If it makes a real difference in games that I play, and has widespread adoption by Fall, then I might be all in for an upgrade.

I'd rather have 16GB of memory over another resource hog that I never enable.

Just my opinion without having seen a review on the new AMD. Also I hope that NVIDIA can get more than 12 FreeSync monitors to work in the future through a driver update. That would be nice!

The thing you are ignoring is that there will be more and more games supporting DXR, DLSS, or a combination of both as we move forward, and these features will shine in games that aren't fast-paced FPS titles (I always thought BF5 was a bad choice to show off DXR). NVIDIA has already demonstrated that DLSS in combination with DXR largely alleviates the performance cost of DXR alone (in BF5), so I expect more and more titles to utilize that in the future. Plus DLSS alone is a huge advancement for 4K gaming; you can't deny that, and it will undoubtedly gain traction since it costs developers nothing and all the training is done by NVIDIA. And as you mentioned, there are NVENC improvements for OBS, but there are also ShadowPlay game filters, which for me as a gamer is a HUGE plus because now I don't have to worry about ReShade (which doesn't work in some games now, like BO4). Now contrast that with AMD's $700 recycled card from 2016: what do they bring? Nothing.

Also, for those that say ray tracing won't get anywhere: it will, since it's supported by DirectX. It may not be huge anytime soon, but all new groundbreaking features take time. In the meantime, those that have the hardware for it can still enjoy it.
 
Got any questions for AMD on the VII? I am writing up some questions now.
 
The thing you are ignoring is that there will be more and more games supporting DXR, DLSS, or a combination of both as we move forward, and these features will shine in games that aren't fast-paced FPS titles (I always thought BF5 was a bad choice to show off DXR).
Would it be possible to have fewer games supporting DLSS than we currently do?
 
Also, am I the only one sitting here wondering why AMD decided to release this 7nm Vega gaming card after stating their 7nm Vega was intended for HPC only? I wonder what the implication is. We pegged the HPC card as a pipecleaner for 7nm GPUs in 2020, and it's great AMD is making an attempt to compete with Nvidia's RTX cards, but is this simply an RTX response or is there something more behind it?

I wonder how TSMC is doing. Smartphone 7nm demand is probably slowing down, crypto hit a mine and sank, and Apple, well, we know how that's going (lower demand and lowered revenue). Perhaps their 7nm orders are freeing up for AMD.

Also, I wonder how the consoles are shaping up. We know Sony is targeting Navi; Xbox will probably do the same. I wonder if Navi is chiplets. That would be pretty cool to see.
 
Looking back at AdoredTV's video... consumer 7nm GPU... the 16GB of HBM was known, and the price was estimated from the Red Gaming Tech piece.

So this Radeon 7 was the result of a concern that AMD had nothing to compete with NVIDIA, so they produced 7nm Vega. But after seeing the not-so-impressive Turing results, AMD was less concerned and supposedly canceled the consumer 7nm Vega. But they had already produced 7nm Vega chips (Radeon 7). They have stock and they don't want to lose money on it, so February 7th... here it comes.
 
It is my opinion that Vega 7 is priced too high.

$599 makes more sense when you factor in hardware features (regardless of available software or practicality).
 
Looking back at AdoredTV's video... consumer 7nm GPU... the 16GB of HBM was known, and the price was estimated from the Red Gaming Tech piece.

So this Radeon 7 was the result of a concern that AMD had nothing to compete with NVIDIA, so they produced 7nm Vega. But after seeing the not-so-impressive Turing results, AMD was less concerned and supposedly canceled the consumer 7nm Vega. But they had already produced 7nm Vega chips (Radeon 7). They have stock and they don't want to lose money on it, so February 7th... here it comes.


Cancelled? I doubt it. Also, isn't Turing still beating AMD anyway? Why wouldn't AMD try to present a product to compete there?
 
Yes, where's Navi? But seriously, why the $699 pricing? At $599 it would have had a unique place in the market.

16GB of HBM2 is a fucking eye-watering amount of $$$.

I wanted to present this summary from Anandtech that raises some questions.

So that AMD is able to sell what are admittedly defective/recovered Vega 20s in a $699 card, produce enough of them to meet market demand, and still turn a profit on all of this is a surprising outcome. I simply would not have expected AMD to get a 7nm chip out at consumer prices this soon. All I can say is that either AMD has pulled off a very interesting incident of consumer misdirection, or the competitive landscape has evolved slowly enough that Vega 20 is viable where it otherwise wouldn’t have been. Or perhaps it’s a case of both.

Shifting gears for a second, while I’ve focused on gaming thus far, it should be noted that AMD is going after the content creation market with the Radeon VII as well. This is still a Radeon card and not a Radeon Pro card, but as we’ve seen before, AMD has been able to make a successful market out of offering such cards with only a basic level of software developer support. In this case AMD is expecting performance gains similar to the gaming side, with performance improving the more a workload is pixel or memory bandwidth bound.

Wrapping things up, the Radeon VII will be hitting the streets on February 7th for $699. At this point AMD has not announced anything about board partners doing custom designs, so it looks like this is going to be a pure reference card launch. As always, stay tuned and we should know a bit more information as we get closer to the video card’s launch date.
 
Yes, where's Navi? But seriously, why the $699 pricing? At $599 it would have had a unique place in the market.
I can answer that one for you.

Navi is coming. And... You can always lower the price, you can't jack it up. And yes, I agree with you. I was truly thinking $649 would have resonated better overall.
 
Having watched the presentation and read the comments here and elsewhere, I'm honestly more psyched about the PS5. Not that I'll be buying one, mind you.

Let me know when CES does the presentation about the telepods. I'll be sure to have my fly swatter handy...
 
It is my opinion that Vega 7 is priced too high.

$599 makes more sense when you factor in hardware features (regardless of available software or practicality).

Then flip the 3-game bundle. That could easily net you $75+ and bring it closer to your $599 price point.
 
A bit underwhelmed by all of this. Not really interested in a Vega II refresh or that $700 price tag; I'm sure a lot of people are, though. That RTX 2060 for $350 is looking mighty tempting now.


I see that the Vega VII is a 16GB model; will there be 8GB or 12GB versions for less? Possibly in the $499 range?
 
So NVIDIA burned bridges with their pricing by bringing DLSS and ray tracing to the world, but AMD is the hero for recycling a GPU from 2016 at the same MSRP? Some people, man.

P.S. NVIDIA supports Adaptive-Sync now, so the FreeSync argument is dead.

They have unproven support. AMD helped usher in Vulkan along with numerous other technologies that see a whole lot more use. Ray tracing is baked into DX12, and as it stands it is woefully underwhelming. DLSS is cool and all, but not a feature worth the asking price.

When those features become widely used, along with more robust FreeSync monitor support (12 out of how many FreeSync monitors?), a reasonable asking price, and last but not least stability, then Nvidia might get my money again.
 
So NVIDIA burned bridges with their pricing by bringing DLSS and ray tracing to the world, but AMD is the hero for recycling a GPU from 2016 at the same MSRP? Some people, man.

P.S. NVIDIA supports Adaptive-Sync now, so the FreeSync argument is dead.

LOL! The VII has 16GB of HBM2 with 1TB/s of bandwidth, and it also doubles as a workstation type of card. As Kyle also said, you can always lower the price but you cannot jack it up. We do not even have a clue what overclocking headroom, if any, there will be. I would not touch a 2080 with a ten-foot pole, since it can barely do 1080p with that RTX "feature", even with the free space invaders game thrown in. :D
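
For anyone wondering where that 1TB/s figure comes from, it falls straight out of the bus math (a sketch assuming the announced four-stack, 4,096-bit HBM2 setup at 2 Gbps effective per pin):

```python
# Memory bandwidth math for a four-stack HBM2 config like the one announced.
# Assumes a 4,096-bit total bus width and 2.0 Gbps effective per pin.
bus_width_bits = 4 * 1024  # four HBM2 stacks, 1,024 bits each
data_rate_gbps = 2.0       # effective per-pin transfer rate

bandwidth_gbs = bus_width_bits * data_rate_gbps / 8
print(f"{bandwidth_gbs:.0f} GB/s")  # 1024 GB/s, i.e. ~1 TB/s
```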
 
I'm pretty sure I already know the answer, but will the Radeon 7 have HDMI 2.1? And if not, can AMD release a DisplayPort-to-HDMI 2.1 adapter for use with those LG OLEDs this year?

Hmm. Would that have to be an active adapter? If so, I wonder how much input lag it would add.
 
Hmm. Would that have to be an active adapter? If so, I wonder how much input lag it would add.

No idea, and I'm also not sure whether VRR works through a DisplayPort-to-HDMI adapter in the first place. Maybe that is not technically feasible. This is why we need faster adoption of new connector standards!!!!
 
Will there be a full Vega 20 gaming card?

Do we know this is not the full Vega 20?

I saw an interesting chart on anandtech.

https://www.anandtech.com/show/13832/amd-radeon-vii-high-end-7nm-february-7th-for-699

[Chart: Radeon VII spec comparison]

If that information is accurate, AMD finally doubled the ROPs from 64 to 128 (they've been stuck at 64 for high-end cards since the R9 290 series released all the way back around 2013).

But look at the stream processors and CUs: it's actually LESS than a regular Vega 64, so it could easily be that this is a much smaller Vega die than the original. Not sure, though. But dropping from 14nm to 7nm while only increasing the transistor count from 12.5 to 13.2 billion suggests to me they are working with a smaller die than Vega 64 was, possibly much smaller, which allows for more headroom in the future.
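
If the die-size numbers floating around are right (~486mm² for Vega 10 and ~331mm² for Vega 20, both approximate), the density math backs that up:

```python
# Back-of-envelope transistor density, using die sizes reported around launch
# (~486 mm^2 for Vega 10, ~331 mm^2 for Vega 20 - treat both as approximate).
vega10 = {"transistors_b": 12.5, "area_mm2": 486}  # 14nm
vega20 = {"transistors_b": 13.2, "area_mm2": 331}  # 7nm

for name, chip in (("Vega 10", vega10), ("Vega 20", vega20)):
    density = chip["transistors_b"] * 1e3 / chip["area_mm2"]  # MTr per mm^2
    print(f"{name}: {density:.1f} MTr/mm^2")

shrink = 1 - vega20["area_mm2"] / vega10["area_mm2"]
print(f"Die area shrink: {shrink:.0%}")  # ~32% smaller despite more transistors
```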
 
Hey Kyle, did they give you a card to review? Hope so, hope so, hope so........ :D
 
"How does Ryzen really perform compared to the competition?"

3rd-gen Ryzen in Cinebench: 8C/16T, not final frequency, early sample, versus the stock frequencies of an Intel 9900K.

Ryzen 2057, Intel 2040.

Marginal win, but still a win. Same core and thread count, so it looks like they have finally caught up in IPC.
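
Just to quantify "marginal":

```python
# How big is that Cinebench win, really?
ryzen, intel = 2057, 2040
print(f"Ryzen lead: {ryzen / intel - 1:.1%}")  # ~0.8% - within run-to-run noise
```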

7nm also allows much lower power use.

This is what I came here for!

The more I think about it, this demonstration is great, but I wonder if heat-soaking the system by loading all the cores might be hiding any lighter-load turbo clock advantages Intel may have.

Would have been nice to see a single core Cinebench as well as the multi-core one.
 
The more I think about it, this demonstration is great, but I wonder if heat-soaking the system by loading all the cores might be hiding any lighter-load turbo clock advantages Intel may have.

Would have been nice to see a single core Cinebench as well as the multi-core one.
One of the reasons Cinebench is not really a good benchmark for true performance: it simply runs too fast and does not heat-load the CPU. That is one of the reasons we use the longer encode workloads we do.
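
If you want to check for heat soak yourself, the idea is to run a sustained all-core load long enough for the cooling to reach steady state and watch whether throughput per interval droops. A minimal sketch (the hash loop is just a stand-in workload; substitute a real encode job):

```python
# Minimal heat-soak check: run a sustained all-core load and log throughput
# per interval. Declining ops/s over time suggests thermal throttling.
import hashlib
import os
import time
from multiprocessing import Pool

def burn(seconds: float) -> int:
    """Hash continuously for `seconds`, return iterations completed."""
    end, n, blob = time.monotonic() + seconds, 0, b"x" * 4096
    while time.monotonic() < end:
        hashlib.sha256(blob).digest()
        n += 1
    return n

if __name__ == "__main__":
    interval, total_minutes = 30.0, 10  # ~10 min heat-soaks most air coolers
    workers = os.cpu_count()
    with Pool(workers) as pool:
        for i in range(int(total_minutes * 60 / interval)):
            counts = pool.map(burn, [interval] * workers)
            print(f"t={i * interval:5.0f}s  {sum(counts) / interval:,.0f} ops/s")
```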
 
The fun thing is, I bet it's costing NV more for a 12nm wafer full of 2080/Ti chips than it's costing AMD for a wafer of 7nm Vega 20 chips, not to mention the yields have to be better.
 
Kyle, don't forget to ask about HDMI 2.1 support. I will assume this is going to be the card for 4K 144Hz FreeSync 2 HDR.
 
Do we know this is not the full Vega 20?

I saw an interesting chart on anandtech.

https://www.anandtech.com/show/13832/amd-radeon-vii-high-end-7nm-february-7th-for-699

[Chart: Radeon VII spec comparison]

If that information is accurate, AMD finally doubled the ROPs from 64 to 128 (they've been stuck at 64 for high-end cards since the R9 290 series released all the way back around 2013).

But look at the stream processors and CUs: it's actually LESS than a regular Vega 64, so it could easily be that this is a much smaller Vega die than the original. Not sure, though. But dropping from 14nm to 7nm while only increasing the transistor count from 12.5 to 13.2 billion suggests to me they are working with a smaller die than Vega 64 was, possibly much smaller, which allows for more headroom in the future.
I would expect retail cards to run 1700MHz to 1800MHz while gaming. Certainly this is just a guess at this time. But let's remember that AMD does its clock statements a bit differently than NVIDIA. 1800MHz Boost Clock is still "up to" 1800MHz.
 
Have you actually seen this problem in your testing? I have tried to test for heat soaking by running it over and over again in rapid succession and seeing if performance declines, and I have had almost identical results over and over. But with more modern CPUs that run closer to the thermal envelope and have more aggressive turbo profiles, this may make more of a difference.
That is because running it over and over does not heat-load the CPU or the cooling attached to it, especially a custom loop system. But let's get back to on-topic conversation, please.
 
First obvious question I have is: will there be lower-VRAM SKUs from AIBs? 8GB, etc... If so, and the price is cheaper, I can see that being compelling.

Is CrossFire still supported, both in hardware and software?

What is AMD's opinion on ray tracing in games and its opinion on NVIDIA's implementation currently?
All those are already on the list. :)
 