AMD's New Graphics Engineer Pledges Yearly GPU Releases to Make PCs Fun Again

Nice, we'll see if that actually happens. I'm still waiting on the yearly episodic releases of the HL franchise. :(
 
I couldn't care less if they release a card every year or every two years. We need performance that's in the ballpark of Nvidia at prices that will make you say "I'm going to buy AMD this time". Ryzen is a perfect example of this. Is Intel faster for gaming than AMD? Sure it is, but Ryzen is in the ballpark, and that alone has made people jump ship from Intel. The marketing requires no smoke and mirrors. Year after year, AMD's marketing team has had the monumental task of selling a shit-flavored popsicle to everyone and making people believe that it's not. Just freaking build a fast GPU, AMD, and keep pricing aggressive. It's so simple.
 
That's all great, but AMD won't even give us mobile Vega 8 owners a new driver. I, for one, won't ever touch another AMD product; at least Nvidia supports their products.
 
I think any new graphics card at all would be a good start. Baby steps and all that.
 
Someone needs to release a new card, I want to upgrade my 750ti as it is finally getting bogged down. I'm not paying the current prices for old cards.
 
That's all great, but AMD won't even give us mobile Vega 8 owners a new driver. I, for one, won't ever touch another AMD product; at least Nvidia supports their products.

Long history of lack of mobile card support on the AMD side... Back in 2008-09, AMD/ATI refused to put out a single driver update for their flagship Mobility 4870 X2 -- over its entire lifetime. Can you imagine? Not a single update for a $700 card. One of the biggest FUs to customers I've ever seen in 30 years of building PCs.

Really hope the competition comes back, as graphics consumers are presently in a really tough spot.
 
I just hope this "new GPU every year" is not just rebranding.
 
I just hope this "new GPU every year" is not just rebranding.
I really don't think it will be. I think AMD's Ryzen strategy comes directly from Lisa Su, and she is moving as aggressively as possible there; she seems to have unleashed her engineers and successfully kept them ambitious and aggressive. It would seem the GPU side had a good number of problems to work out, but I believe she will do the same as with Ryzen. The 7nm Instinct seems like a very real product that will be delivered in the proper time frame; they will work from there, I'm sure. Remember too, the GPU team also has a shitload of work: consoles, mobile, maybe even phones again... maybe... A lot of work.
 
I couldn't care less if they release a card every year or every two years. We need performance that's in the ballpark of Nvidia at prices that will make you say "I'm going to buy AMD this time". Ryzen is a perfect example of this. Is Intel faster for gaming than AMD? Sure it is, but Ryzen is in the ballpark, and that alone has made people jump ship from Intel. The marketing requires no smoke and mirrors. Year after year, AMD's marketing team has had the monumental task of selling a shit-flavored popsicle to everyone and making people believe that it's not. Just freaking build a fast GPU, AMD, and keep pricing aggressive. It's so simple.

Sell shit popsicles for years and no one bats an eye. Demo an unreleased product at 5GHz using a custom cooler and everyone loses their mind! :) I'm seeing a lot of excuses when it comes to AMD's deception over the years.
 
I really don't think it will be. I think AMD's Ryzen strategy comes directly from Lisa Su, and she is moving as aggressively as possible there; she seems to have unleashed her engineers and successfully kept them ambitious and aggressive. It would seem the GPU side had a good number of problems to work out, but I believe she will do the same as with Ryzen. The 7nm Instinct seems like a very real product that will be delivered in the proper time frame; they will work from there, I'm sure. Remember too, the GPU team also has a shitload of work: consoles, mobile, maybe even phones again... maybe... A lot of work.



It's good that AMD is shooting for AI, HPC, etc. But they have to do a lot better to catch up. Not only on the hardware side: AMD has NONE of the software and tools that Nvidia offers. It doesn't matter if Instinct is twice as fast as Volta if it doesn't have CUDA or TensorRT equivalents.
 
I'm going to preface this by saying that I'm a Team Green fanboy here, and thus my viewpoint is from the nVidia card side of things. It can be argued that 60 FPS is the minimum for a smooth-running game, although higher FPS is better PROVIDED you have a monitor that supports that higher FPS. And, according to the May 2018 Steam Hardware Survey, 60.49% of Steam users who participated in the survey have 1920x1080 single monitors (no max-FPS information available). As far as I can see, the 1050 Ti is a fairly good performer at 1080p for popular games such as Fortnite, PlayerUnknown's Battlegrounds, CS:GO, and League of Legends. Granted, you may have to turn down some of the eye candy in the game, but it is workable. 8.06% of users have a 1050 Ti card, while 11.89% of users have a 1060 card; they are the #2 and #1 ranked cards. Obviously, these folks aren't [H]ardGamers.
 
I, for one, trust the Wang.

After watching that Intel engineer interview from the other thread I have no doubt things are about to get fun again. Hell, they already are.
 
Some companies are better at being number 2s...

[Image: Silvio from The Sopranos]
 
Great, we already put up with software that regurgitates itself with sequels every year. Now we have to put up with hardware manufacturers doing the same thing. If they really think this is a good thing that people want, they're dumbasses. They seriously can't be thinking that's good, can they?!

It's nothing new... just look at Nvidia with the 8 and 9 series, the GTX 400 and 500 series, and then don't forget about the 10-12 month Ti refreshes in the 600, 700, 900, and 1000 series. Welcome to the last 10 years of GPU history.

I'm going to preface this by saying that I'm a Team Green fanboy here, and thus my viewpoint is from the nVidia card side of things. It can be argued that 60 FPS is the minimum for a smooth-running game, although higher FPS is better PROVIDED you have a monitor that supports that higher FPS. And, according to the May 2018 Steam Hardware Survey, 60.49% of Steam users who participated in the survey have 1920x1080 single monitors (no max-FPS information available). As far as I can see, the 1050 Ti is a fairly good performer at 1080p for popular games such as Fortnite, PlayerUnknown's Battlegrounds, CS:GO, and League of Legends. Granted, you may have to turn down some of the eye candy in the game, but it is workable. 8.06% of users have a 1050 Ti card, while 11.89% of users have a 1060 card; they are the #2 and #1 ranked cards. Obviously, these folks aren't [H]ardGamers.

Cost > caring if I have MSAA enabled. 10 years ago that was a completely different story; now, on the other hand, I haven't seen an AAA game release in a long time that's been worth shelling out $400+ for a GPU, and I still don't see any future releases worth buying.
 
10 years ago that was a completely different story; now, on the other hand, I haven't seen an AAA game release in a long time that's been worth shelling out $400+ for a GPU, and I still don't see any future releases worth buying.

Any particularly good AAA game releases in your book from the past few years? I find it hard to pay $60 when I have a backlog of unplayed games and, by waiting a year or so, I can get the game for less than $20. Another year or so after that, and it's even lower.

Here I am, rocking my single 980, which I got in November 2014. Yes, I paid a pretty penny for it, and at the time I had a 1080p 60Hz monitor. Now I have a 2K G-Sync monitor, and even in Overwatch I'm getting 90 FPS. So while upgrading that card is "nice to have", I'm in no particular hurry to upgrade it, although I will admit that I have been eyeing the 1180 Ti (or 2080 Ti) whenever it gets released (probably late 2019).
 
Any particularly good AAA game releases in your book from the past few years? I find it hard to pay $60 when I have a backlog of unplayed games and, by waiting a year or so, I can get the game for less than $20. Another year or so after that, and it's even lower.

Here I am, rocking my single 980, which I got in November 2014. Yes, I paid a pretty penny for it, and at the time I had a 1080p 60Hz monitor. Now I have a 2K G-Sync monitor, and even in Overwatch I'm getting 90 FPS. So while upgrading that card is "nice to have", I'm in no particular hurry to upgrade it, although I will admit that I have been eyeing the 1180 Ti (or 2080 Ti) whenever it gets released (probably late 2019).

I still rock a 9790 at 1.1GHz. I'm also running 2K. And while the 9790 is getting a little long in the tooth, there is zero reason for me to upgrade except for VR. And I'm waiting for next gen, which will surely be better at VR. The top end of VR now is the 1080 Ti and Titan, and even they barely play adequately with today's batch of games. With next-gen games and higher-res headsets, they will quickly be obsoleted to the ranks of the barely playable RX 580/1060.

I hate NVIDIA's business practices. But if they offer the only product that can muscle through next-gen VR, then so be it.


And I agree. Competition is dead, and miners are creating a bad market for gamers. A middle-of-the-pack RX 580/1060 will cost you $300. That's what my very top-end Sapphire 9790 cost me 4 years ago. I'd have to triple that cost to get top end now.
 
It's nothing new... just look at Nvidia with the 8 and 9 series, the GTX 400 and 500 series, and then don't forget about the 10-12 month Ti refreshes in the 600, 700, 900, and 1000 series. Welcome to the last 10 years of GPU history.

Each side has been doing that for the most part. There hasn't been a new nVidia card out for how long now? Not every year as far as I know.
 
Yeah, with Nvidia announcing a delay... I think they are waiting on AMD to release, effectively resting on their production laurels. As soon as AMD comes out with a next gen and announces it, Nvidia will hit the PR market hard with their own next-gen card. I think their goal is to smash AMD.

I kind of hope this bites them in the ass and AMD simply comes out with something so good Nvidia has to lose the crown as performance leader. Then the real trick will be AMD keeping up with demand for their product.
 
Each side has been doing that for the most part. There hasn't been a new nVidia card out for how long now? Not every year as far as I know.

Yes and no. Both have done similar things, but where it differs is that AMD has pretty much released their entire lineup within 6 months of the initial release. Where things really got muddy is with the 7k series, where they pulled an Nvidia-9-series WTF moment and rebranded the 7k series to the RX series because Polaris wasn't ready.

GTX 1080 released May 2016, GTX 1050 Ti released Oct 2016, GTX 1080 Ti released March 2017, GTX 1070 Ti released Nov 2017.
 
Yeah, with Nvidia announcing a delay... I think they are waiting on AMD to release, effectively resting on their production laurels. As soon as AMD comes out with a next gen and announces it, Nvidia will hit the PR market hard with their own next-gen card. I think their goal is to smash AMD.

I kind of hope this bites them in the ass and AMD simply comes out with something so good Nvidia has to lose the crown as performance leader. Then the real trick will be AMD keeping up with demand for their product.
Except Nvidia didn't announce a delay. They announced that they're not interested in freezing sales with an announcement. There would be no upside to blabbing "pretty soon".

It will happen last minute as usual.
 
I am assuming that is sarcasm as you can't possibly be that ignorant.

Wasn't sarcasm. 4K isn't benefiting anything on most common monitor sizes and places a huge load on the GPU -> you double or triple the cost for near-zero benefit.
 
Wasn't sarcasm. 4K isn't benefiting anything on most common monitor sizes and places a huge load on the GPU -> you double or triple the cost for near-zero benefit.
Oh, so ignorance it is then. Saying people need a 75-inch monitor for 4K to be worth it in gaming is beyond fucking stupid. It must be nice to be as oblivious and blind as you are, though, as you can save some money.
 
Wasn't sarcasm. 4K isn't benefiting anything on most common monitor sizes and places a huge load on the GPU -> you double or triple the cost for near-zero benefit.

Ehhhh... there are benefits other than raw pixel-count definition. I would agree that from a pixel-density standpoint, a 32-inch display at 4K and a 32-inch display at 1440p don't have much visual difference. (And with ShadowPlay you are already sacrificing resolution performance.)

Where 4K (and 1440p over 1080p as well) makes a HUGE difference is color saturation and definition to the eye. Sure, you can't tell a line difference perhaps, but what really shows up is that when you have effectively 4 times the light coming at your eyes, it is easier to see definition you otherwise may miss. Details pop more, things stand out more, and I would dare say it is easier on the eye as long as everything scales appropriately.

So yes, from a DPI standpoint you don't see much difference depending on your distance, but that is made up for in a myriad of other ways that are not just DPI. (Though they are directly contributed to by it.)
 
Even on a 23-inch screen the difference is fucking huge in some games. For example, at 1080p, Dishonored 2 has some horrible aliasing in some spots that even flickers a bit, but going to 4K DSR nearly eliminates it, and the difference is night and day, especially in motion. Even the first Crysis has horrible jaggies around the trees at 1080p on my 23-inch screen, but going to 4K DSR is again a night-and-day difference. There are just tons of games where 4K is clearly better looking than 1080p or even 1440p, even on a small screen. To say we need a 75-inch screen to see the difference in games is a joke. Anyone claiming that is too stupid to even waste time trying to convince otherwise, though.
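For what it's worth, the raw pixel math behind this back-and-forth is easy to sanity-check. Here's a quick sketch (the 32" and 23" diagonals are just the screen sizes mentioned above): 4K is exactly 4x the pixels of 1080p, but the density you actually perceive depends on screen size and viewing distance.

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a given resolution and diagonal screen size."""
    # Diagonal resolution in pixels divided by diagonal size in inches.
    return math.hypot(width_px, height_px) / diagonal_in

# The resolutions under discussion in this thread.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

for name, (w, h) in resolutions.items():
    print(f'{name}: {w * h / 1e6:.1f} MP, '
          f'{ppi(w, h, 32):.0f} PPI at 32", {ppi(w, h, 23):.0f} PPI at 23"')
```

So 4K at 32" works out to roughly 138 PPI versus about 92 PPI for 1440p at the same size. (Note that pixel count says nothing about brightness; that's backlight/panel territory, as others point out further down.)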
 
I don't expect Nvidia to lose any performance crown at any point in at least two years... I think we are going to have heated competition in mid, low, and mobile... I don't know how Nvidia is going to deal with their affinity for high prices, though.
 
I don't expect Nvidia to lose any performance crown at any point in at least two years... I think we are going to have heated competition in mid, low, and mobile... I don't know how Nvidia is going to deal with their affinity for high prices, though.

I don't expect it either, but that would surely throw a kink in their plans if/when it does happen. (Unless they just choose to let it happen to generate market churn after investing heavily in their direct competitor. You know... like Microsoft did with Apple.)
 
Any news other than "we have a 1080 Ti killer at equal to or less than its current actual can-buy-it-regularly retail price" isn't very interesting. They really need to be on top again; they've been playing catch-up for too long.

Nvidia will 'kill' the 1080Ti with a new product before AMD does...

...and then AMD might catch up to the 1080Ti.

Wouldn't it be nice to be wrong?
 
Ehhhh... there are benefits other than raw pixel-count definition. I would agree that from a pixel-density standpoint, a 32-inch display at 4K and a 32-inch display at 1440p don't have much visual difference. (And with ShadowPlay you are already sacrificing resolution performance.)

Where 4K (and 1440p over 1080p as well) makes a HUGE difference is color saturation and definition to the eye. Sure, you can't tell a line difference perhaps, but what really shows up is that when you have effectively 4 times the light coming at your eyes, it is easier to see definition you otherwise may miss. Details pop more, things stand out more, and I would dare say it is easier on the eye as long as everything scales appropriately.

So yes, from a DPI standpoint you don't see much difference depending on your distance, but that is made up for in a myriad of other ways that are not just DPI. (Though they are directly contributed to by it.)
It's official, I've heard it all. o_O

*Poster is not responsible for any eye damage from this message.
 
Nvidia will 'kill' the 1080Ti with a new product before AMD does...

...and then AMD might catch up to the 1080Ti.

Wouldn't it be nice to be wrong?
It would be nice. I've been using nothing but Nvidia GPUs since my Ti 4400; I really want some healthy competition.
 
I'm going to preface this by saying that I'm a Team Green fanboy here, and thus my viewpoint is from the nVidia card side of things. It can be argued that 60 FPS for a game is a minimum for a smooth running game, although higher FPS is better PROVIDED you have a monitor that supports that higher FPS. And, according to the May, 2018 Steam Hardware Survey, 60.49% of Steam users who participated in the survey have 1920x1080 single monitors (no max FPS information available). As far as I can see, the 1050Ti is a fairly good performer at the 1080p resolutions for the popular games such as Fortnite, Player Unknown Battlegrounds, CS:GO, and League Of Legends. Granted, you may have to turn down some of the eye candy in the game, but it is workable. 8.06% of users have a 1050Ti card, while 11.89% of users have a 1060 card, and are the #2 and #1 ranked cards. Obviously, these folks aren't [H]ardGamers.
Higher FPS is nothing without higher averages and minimums that you really notice.
 
8.06% of users have a 1050Ti card, while 11.89% of users have a 1060 card, and are the #2 and #1 ranked cards. Obviously, these folks aren't [H]ardGamers.

I don't want to be part of that group if someone with a 1060 obviously isn't "worthy" or whatever is going on.
 
It's official, I've heard it all. o_O

*Poster is not responsible for any eye damage from this message.

Also, if you have a 120Hz screen you double the amount of light that gets to your eyes, so it's 8 TIMES!!! And if you increase the brightness to 100% you could double it again and maybe burn your retinas, but hey, it will be worth it. :ROFLMAO::D:D:ROFLMAO:
 
Also, if you have a 120Hz screen you double the amount of light that gets to your eyes, so it's 8 TIMES!!! And if you increase the brightness to 100% you could double it again and maybe burn your retinas, but hey, it will be worth it. :ROFLMAO::D:D:ROFLMAO:

LOL, ok, you jackholes. I made one guesstimate and you are all having fun with this. When each pixel in effect is its own light output, there is more energy being projected toward you because of it.
 
So basically we can expect a lot more money grubbing shit GPU releases...
 
AFAIK 4K itself doesn't do jack shit to your color or brightness; those monitors just have a late-model panel. I wouldn't get a 4K gaming monitor myself; just too much GPU overhead and, by that, cost.

Unless you have an OLED screen, a pixel does not put out light by itself at all. So higher pixel density does not equal higher brightness.
 