AMD Fury series coming soon.

4K still isn't ready for prime time, honestly. The lack of real hardware and connection options, along with buggy drivers, makes it at least two years away from being mainstream.

I just upgraded from 1200p to 1440p, so I'm happy enough for a while.
 
Who are gonna be the lucky 10 or so people to grab the Fury X stock at launch? I know I won't be able to; everything here in Australia just seems to be out of stock all the time.
 
Drivers have a lot more to do with it than the game engine; game engines aren't smart that way. When it's close to the memory wall, at ~1900 MB used, the 680 shouldn't stutter.

I don't think so. Considering that cards with more than 2 GB used more than 2 GB, it would be logical to conclude that the game can use more than 2 GB, emphasis on "can", not "needs". And a game engine has a lot more logic in it than drivers do; saying they aren't smart that way is naive.
 


Game engines don't have that logic. Just download UE4 or Unity if you don't believe me: they will tell you if settings are too high and frame rates are dropping, and that's about it. I can push a game engine to use as much memory as I want; of course, developing on it that way would be a pain.

Yes, the game engine tells the drivers what it needs, but the drivers can and will allocate based on system restrictions up to a certain point. If it's running out of VRAM it can clear out a buffer before it creates another; engines don't have that luxury. Actually, they don't need it, because it's up to the developer to decide what a level in an engine needs.
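To make the "clear out a buffer before it creates another" idea concrete, here is a minimal C++ sketch of that kind of budget-plus-eviction bookkeeping. The class and buffer names are hypothetical, and this is not how any particular AMD or NVIDIA driver is actually written; real drivers juggle residency lists, paging to system memory, and per-application heuristics.

```cpp
// Hypothetical illustration only: a tiny VRAM budget tracker that evicts the
// least-recently-used buffer before creating a new one, i.e. the "clear a
// buffer out before starting another" idea above.
#include <cstddef>
#include <iostream>
#include <list>
#include <string>
#include <unordered_map>

class VramBudget {
public:
    explicit VramBudget(std::size_t budgetBytes) : budget_(budgetBytes) {}

    // Create a named buffer, evicting cold buffers first if we would go over budget.
    bool allocate(const std::string& name, std::size_t bytes) {
        if (bytes > budget_) return false;          // could never fit
        while (used_ + bytes > budget_ && !lru_.empty()) {
            std::string victim = lru_.back();       // coldest buffer
            lru_.pop_back();
            std::cout << "evicting " << victim << " (" << sizes_[victim] << " bytes)\n";
            used_ -= sizes_[victim];
            sizes_.erase(victim);
        }
        lru_.push_front(name);
        sizes_[name] = bytes;
        used_ += bytes;
        return true;
    }

    // Mark a buffer as recently used so it is evicted last.
    void touch(const std::string& name) {
        lru_.remove(name);
        lru_.push_front(name);
    }

    std::size_t usedBytes() const { return used_; }

private:
    std::size_t budget_;
    std::size_t used_ = 0;
    std::list<std::string> lru_;                    // front = most recently used
    std::unordered_map<std::string, std::size_t> sizes_;
};

int main() {
    VramBudget vram(std::size_t(2048) << 20);                     // pretend 2 GB card
    vram.allocate("level_textures",     std::size_t(1500) << 20);
    vram.allocate("shadow_maps",        std::size_t(400)  << 20);
    vram.allocate("new_streaming_pool", std::size_t(300)  << 20); // forces an eviction
    std::cout << "in use: " << (vram.usedBytes() >> 20) << " MB\n";
}
```

The only point of the sketch is that whoever owns the allocation, driver or engine streaming system, can trade a stall for staying under the physical VRAM limit.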
 

OK, then the game does... Still, it was being done three years ago, so why wouldn't it be in use now? Just because a UE4 demo doesn't do it doesn't mean it can't be, and isn't being, done elsewhere.
 


What UE4 demo? You can download the engine, make a level yourself and see what happens. It is probably the most advanced graphics engine when it comes to features right now, or close to CryEngine.

I know there is a whole debate about different engines and what they can do, but I can tell you from experience it's not the engines themselves; it's the devs on the team that can make engines scream.
 

Right, so: are you a game developer? Are you experienced in software development at all? If so, how much of the engine did you take advantage of? Do you know for a fact that the game engine has no control over it? Are you sure it's 100% driver-controlled?
If you cannot answer those questions with certainty, I'll stick to my observation :p How did NV get around it? 680s did not stutter and used less than 2 GB, while 7970s used closer to 2500 MB in the same games with the same settings. If it was only drivers, the 680 would have, at some point, stuttered as textures were being loaded from system memory. In that case, is Frostbite more advanced than UE4 :p?
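For what it's worth, the per-card usage numbers being thrown around here are driver-reported values, read through vendor interfaces. Below is a rough C++ sketch of one way a tool could query them on 2015-era hardware via the GL_NVX_gpu_memory_info and GL_ATI_meminfo OpenGL extensions; it assumes a current (compatibility-profile) GL context, and the figures it prints are whatever the driver decides to report rather than a ground-truth measurement.

```cpp
// Hypothetical sketch: reading the driver-reported video memory numbers that
// get compared in arguments like this one. Assumes a current OpenGL
// (compatibility-profile) context; on Windows, include <windows.h> before <GL/gl.h>.
#include <GL/gl.h>
#include <cstdio>
#include <cstring>

// Enums from GL_NVX_gpu_memory_info and GL_ATI_meminfo; values come back in KiB.
#ifndef GL_GPU_MEMORY_INFO_TOTAL_AVAILABLE_MEMORY_NVX
#define GL_GPU_MEMORY_INFO_TOTAL_AVAILABLE_MEMORY_NVX   0x9048
#define GL_GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX 0x9049
#endif
#ifndef GL_TEXTURE_FREE_MEMORY_ATI
#define GL_TEXTURE_FREE_MEMORY_ATI 0x87FC
#endif

void printReportedVram() {
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    if (ext && std::strstr(ext, "GL_NVX_gpu_memory_info")) {
        GLint totalKb = 0, freeKb = 0;
        glGetIntegerv(GL_GPU_MEMORY_INFO_TOTAL_AVAILABLE_MEMORY_NVX, &totalKb);
        glGetIntegerv(GL_GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX, &freeKb);
        std::printf("NV driver reports %d MB of %d MB in use\n",
                    (totalKb - freeKb) / 1024, totalKb / 1024);
    } else if (ext && std::strstr(ext, "GL_ATI_meminfo")) {
        GLint info[4] = {0, 0, 0, 0};   // [0] = total free KiB in the texture pool
        glGetIntegerv(GL_TEXTURE_FREE_MEMORY_ATI, info);
        std::printf("AMD driver reports %d MB free for textures\n", info[0] / 1024);
    } else {
        std::printf("No vendor memory-info extension exposed\n");
    }
}
```

Because these are driver-side numbers, the same game can legitimately show different figures on a 680 and a 7970, which is part of what this argument is about.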
 


Yes I am. I'm part owner of an independent game company that is using the UE4 engine, outside of my full-time job as a sr. producer for one of the three major television stations in the US and their movies, specifically doing special effects. I do various tasks, from programming to scripting to 3D assets and production work.

Right now we are still in the planning stages while assets and certain special effects get done, but the specs we have looked into, we have looked into in depth.

As of right now, the demo we have been creating uses more than 6 GB of memory; for the 680 to run it well we have to drop texture detail and game detail down one notch (one mipmap level, which effectively cuts our texture memory usage to about 1/4). Mind you, this is just a demo/cinematic; the game will have many more assets on the screen.
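As a sanity check on the mipmap arithmetic: each mip level has half the resolution in each dimension, so it holds a quarter of the texels of the level above, and a full chain is only about 4/3 the size of its top level. Dropping the top level therefore leaves roughly a quarter of the texture memory. A quick C++ calculation with a made-up 4096x4096 RGBA8 texture:

```cpp
// Quick arithmetic check of the "drop one mip level" point: each level holds a
// quarter of the texels of the level above, so discarding the top level leaves
// roughly 1/4 of the total texture memory. The texture size here is made up.
#include <cstdint>
#include <cstdio>

// Bytes for a full mip chain of a square RGBA8 texture (4 bytes per texel).
std::uint64_t mipChainBytes(std::uint32_t topDim) {
    std::uint64_t total = 0;
    for (std::uint32_t d = topDim; d >= 1; d /= 2) {
        total += static_cast<std::uint64_t>(d) * d * 4;
        if (d == 1) break;
    }
    return total;
}

int main() {
    const std::uint32_t dim = 4096;                         // hypothetical 4K x 4K texture
    const std::uint64_t full    = mipChainBytes(dim);       // mips 4096 ... 1
    const std::uint64_t dropped = mipChainBytes(dim / 2);   // start at 2048 instead
    std::printf("full chain:  %.1f MB\n", full / (1024.0 * 1024.0));
    std::printf("top dropped: %.1f MB (%.0f%% of full)\n",
                dropped / (1024.0 * 1024.0), 100.0 * dropped / full);
}
```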

I explained to you that if, let's say, a game uses more than 2 GB, just a little over, the drivers can be modified to, say, clear a buffer out before starting another buffer; this will drop VRAM usage under 2 GB.
 

500-600 MB over is not "just over", it's 25% :p
And you only answered the first two! (At least with some certainty.)

I have to ask though, who uses UE4 for special effects in television?
 


UE4 isn't for my full-time job; that's just something I'm doing on the side. If it takes off, bye bye full-time job lol.

There are a lot of things that can be done to drop memory usage with drivers; I just gave one example. I really can't go into too much detail because driver coding isn't my thing; it's best if you ask that at B3D or msdev, they would be much better suited to give a good answer.

Remember back when the X1900 XT had a memory allocation bug and AMD fixed it? On a per-game basis this happens quite often; it's one of the reasons why so many drivers are released.

Oh, I didn't have time before to respond about the Frostbite engine. It's a good engine, but I don't see where it's better than CryEngine or UE4; engines nowadays are flexible enough to do just about any type of game, it just depends on how good the devs are.

edit: they all have their limitations, which vary from one to another, but it's not like before. Many of the "cool" features in Frostbite can be done with Blueprints in UE4, so no programming is necessary at all.
 
https://www.youtube.com/watch?v=BWbRSHtHI6c

Richard Huddy talks about HDMI 2.0, Fury, 4 GB, and 4-way CrossFire.

tl;dr: HDMI 2.0 was a time-to-market decision and they are pushing the active adapters.
4-way CrossFire is supported.
Fury sounds more like a full Fiji XT based on their response ("the fury x is the name that we've attached to that set up"); NO INFO until we get closer to mid-July.
They can't talk about 8 GB models yet.
 

Also they explicitly decided on the current output setup for the best possible gaming experience.
 
Well, he is Chief Bullshit Evangelist for AMD, so what else was he going to say?
Did you watch the video or are you just being intentionally confrontational?
He seemed pretty upfront about it.

I don't agree with their "active adapter later this summer" solution, but he didn't make excuses.
 
HDMI 2.0 was a time-to-market decision and they are pushing the active adapters.
Also they explicitly decided on the current output setup for the best possible gaming experience.
What could possibly be better with Fury's output configuration than with their classic 1 DL-DVI + 5 mDP on the ultra-enthusiast cards?

DP 1.2 to HDMI 2.0 adapters are just now popping up on AliExpress and they are not exactly cheap. Maybe that's because AMD and their partners have cleaned out the entire Shenzhen area of these adapters in order to put them in Fury boxes? :rolleyes:
 
tl;dr HDMI 2.0 was a time-to-market decision and they are pushing the active adapters.

Nvidia released HDMI 2.0 cards in September 2014, nine months ago, and it's been reported that GM204 was taped out five months before that, so Nvidia would have had HDMI 2.0 in the works since May 2014 - which makes sense, since the HDMI 2.0 spec was released in September 2013.

And that's not enough time for AMD?
 

HDMI is the g-sync of output sources...

Fk HDMI and the 'costs' associated with it... http://www.hdmi.org/manufacturer/terms.aspx

DP is FREE.

I for one am glad that AMD is choosing to leave off HDMI 2.0.

Those who truly need that technology, the 'enthusiasts', may curb their purchases of 4K TVs to only the ones with DP, which will give TV manufacturers more incentive to ALWAYS have DP...

Eventually, HDMI will hopefully phase itself out for being a POS that's playing catch-up and costing money...
 
Nvidia released HDMI 2.0 cards in September 2014, nine months ago, and it's been reported that GM204 was taped out five months before that, so Nvidia would have had HDMI 2.0 in the works since May 2014 - which makes sense, since the HDMI 2.0 spec was released in September 2013.

And that's not enough time for AMD?

And what exactly has been the benefit of HDMI 2.0 on those cards? Yeah, you can say you have it and wave the Nvidia pride flag, but who is actually using it?

HDMI 2.0: the most worthless thing since _______. (Fill in the blank.)
 
What's your PayPal address? I have an invoice to send you to replace my $2000 receiver, since AMD claimed this was the 4K living room solution.
 
And what exactly has been the benefit of HDMI 2.0 on those cards? Yeah, you can say you have it and wave the Nvidia pride flag, but who is actually using it?

HDMI 2.0: the most worthless thing since _______. (Fill in the blank.)

Several hundred posts in the display forum are from guys using 4K Samsung displays that are HDMI-only. It's the de facto high-end enthusiast 4K display, actually.
 
Usually people that are living-room oriented have an Xbone and a Kinect. Those don't cost $2k LOL :p
 
Several hundred posts in the display forum are from guys using 4K Samsung displays that are HDMI-only. It's the de facto high-end enthusiast 4K display, actually.

And in that thread, how many actually bought the TV? 10 people? I don't want to start trolling, but the HDMI 2.0 thing is pretty funny.

Earlier you said you were going to end up getting 980 Tis; how are they running? I hope you got the aftermarket ones.
 
HDMI is the g-sync of output sources...

Fk HDMI and the 'costs' associated with it... http://www.hdmi.org/manufacturer/terms.aspx

DP is FREE.

I for one am glad that AMD is choosing to leave off HDMI 2.0.

Those who truly need that technology, the 'enthusiasts', may curb their purchases of 4K TVs to only the ones with DP, which will give TV manufacturers more incentive to ALWAYS have DP...

Eventually, HDMI will hopefully phase itself out for being a POS that's playing catch-up and costing money...

It doesn't matter what you like.

You have to conform to standards. For example, the U.S. uses miles rather than kilometers, which are SI units, and a bunch of other measurements are non-SI too. But it doesn't mean that manufacturers targeting the U.S. audience somehow adhere to SI units only.

You have to cater to all market conditions, especially if you are launching a new architecture that gave you plenty of time to include the appropriate spec.
 
For me, it's a matter of the TV manufacturers needing to adapt to modern times and realize that DisplayPort is the future and that they need to implement it on TVs.

HDMI is good on 1080p panels, but that is yesterday's tech; what we see now with HDMI 2.0 is more like artificial life support, IMO, and there's a need to transition to new and better tech in order to keep moving forward.
 
HDMI is the g-sync of output sources...

Fk HDMI and the 'costs' associated with it... http://www.hdmi.org/manufacturer/terms.aspx

DP is FREE.

I for one am glad that AMD is choosing to leave off HDMI 2.0.

Those who truly need that technology, the 'enthusiasts', may curb their purchases of 4K TVs to only the ones with DP, which will give TV manufacturers more incentive to ALWAYS have DP...

Eventually, HDMI will hopefully phase itself out for being a POS that's playing catch-up and costing money...

DP is cheaper to license in your products but is not free.

Some of us like to watch movies and play games from our HTPCs, and receivers and TVs are totally lacking DP. Even if you find a TV with DP, I've not seen any receivers with DP.

As things stand right now, I wish everything was DP. It seems like it's more about trying to keep people from buying cheaper TVs and using them on PCs.
 
And in that thread, how many actually bought the TV? 10 people? I don't want to start trolling, but the HDMI 2.0 thing is pretty funny.

Earlier you said you were going to end up getting 980 Tis; how are they running? I hope you got the aftermarket ones.

More than 10.

The 980 Tis are great. Got reference cards, overclocking to 1500 on the core at stock volts, and noise is minimal. Going to watercool them.
 
Nvidia released HDMI 2.0 cards in September 2014, nine months ago, and it's been reported that GM204 was taped out five months before that, so Nvidia would have had HDMI 2.0 in the works since May 2014 - which makes sense, since the HDMI 2.0 spec was released in September 2013.

And that's not enough time for AMD?

HDMI 2.0 is only present on the 980 Ti.

That is the only card using the correct sil chip.
 
For me, it's a matter of the TV manufacturers needing to adapt to modern times and realize that DisplayPort is the future and that they need to implement it on TVs.

HDMI is good on 1080p panels, but that is yesterday's tech; what we see now with HDMI 2.0 is more like artificial life support, IMO, and there's a need to transition to new and better tech in order to keep moving forward.

What do you mean by artificial life support? It's all about bandwidth and conductivity. You can create a connector of any shape, but it has to be able to support enough bandwidth for your targeted resolution. As it turns out, HDMI 2.0 is perfectly capable of doing just that.
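The bandwidth arithmetic does work out. Here is a rough C++ back-of-envelope using the standard CTA-861 4K60 timing (4400 x 2250 total, 594 MHz pixel clock) and the published maximum link rates for HDMI 1.4, HDMI 2.0 and DP 1.2; the constants are my own assumptions from the public specs, not figures from this thread.

```cpp
// Back-of-envelope bandwidth check for 4K60 4:4:4 8-bit, using the standard
// CTA-861 timing (4400 x 2250 total including blanking = 594 MHz pixel clock).
// Link-rate constants are the published maxima from the HDMI/DP specs; this is
// plain arithmetic, nothing specific to Fury or the cards discussed here.
#include <cstdio>

int main() {
    const double pixelClockHz = 4400.0 * 2250.0 * 60.0;           // 594 MHz
    const double payloadGbps  = pixelClockHz * 24 / 1e9;          // 8 bpc RGB pixel data

    // HDMI/TMDS sends 10 bits per 8-bit symbol on 3 data channels.
    const double neededTmdsGbps = pixelClockHz * 3 * 10 / 1e9;    // what 4K60 needs on the wire
    const double hdmi14MaxGbps  = 340e6 * 3 * 10 / 1e9;           // 340 MHz TMDS limit
    const double hdmi20MaxGbps  = 600e6 * 3 * 10 / 1e9;           // 600 MHz TMDS limit

    // DisplayPort 1.2 HBR2: 4 lanes x 5.4 Gbit/s, 8b/10b coded.
    const double dp12PayloadGbps = 4 * 5.4 * 8.0 / 10.0;

    std::printf("4K60 pixel payload:       %5.2f Gbit/s\n", payloadGbps);
    std::printf("4K60 on the TMDS wire:    %5.2f Gbit/s\n", neededTmdsGbps);
    std::printf("HDMI 1.4 wire limit:      %5.2f Gbit/s  -> not enough\n", hdmi14MaxGbps);
    std::printf("HDMI 2.0 wire limit:      %5.2f Gbit/s  -> fits\n", hdmi20MaxGbps);
    std::printf("DP 1.2 payload capacity:  %5.2f Gbit/s  -> fits\n", dp12PayloadGbps);
}
```

Which is the short version of why HDMI 1.4 tops out at 4K30 (or 4K60 4:2:0), while both HDMI 2.0 and DP 1.2 can drive 4K60 4:4:4.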
 
DP is cheaper to license in your products but is not free.
orly.

Q. What are the user benefits of DisplayPort?
A. DisplayPort provides several direct and indirect benefits to the user. Direct benefits include higher performance capability, the availability of display adapters for legacy display types, and the ability to connect multiple displays to a single video output. Indirect benefits include smaller system form factor and lower system cost, because DisplayPort enables higher system integration, requires less RF shielding, and is royalty free. DisplayPort also uses a small connector, or can be combined with other interfaces onto a single common connector. As the only AV interface with link training, a more robust and stable link is established.
 