Intel ARC A770/A750 Driver Update, Intel Claims Big Performance Gains!

erek

[H]F Junkie
Joined
Dec 19, 2005
Messages
10,875
Some updates on performance with the latest drivers

"The Arc A750 is only a little slower than the A770 (and the competition) and has some other potential advantages over the 6650 XT, namely ray tracing support and upscaling performance. Intel's GPU graphics chip has dedicated units for accelerating every part of both technologies, especially the latter.

Granted, RT support is fairly pointless on these entry-level GPUs. The A750 averaged 81 fps at 1440p and 105 fps at 1080p in our testing, so halving those frame rates, or worse, isn't something anyone is going to be interested in.

To get the best upscaling performance with the A750 or A770, Intel's XeSS needs to be implemented in games, and this is where the Radeon RX 6650 XT has a firm lead, as far more games offer FSR than XeSS. For now, though, Intel appears committed to making Arc work, and driver support for new games has been good, so fingers crossed that this level of dedication continues."

Source: https://www.techspot.com/review/2634-intel-arc-a770-a750-revisit/
 
AMD hasn't figured out how to fix this in the past 10+ years, and Intel fixed it in their first GPU iteration.
AMD made a 500W CPU just a few years back. Customers chasing 10W of power draw while reading spreadsheets are not their customers; they use Intel GPUs...
 
AMD cards currently use 90-100W while idling with multiple different monitors attached. NV (and now Intel) cards use around 20W. That's while not playing games, which is where the majority of PCs spend their time...

It's a well-known issue that has been happening for years. This video will get you started down the rabbit hole, but it's from before Intel fixed it on their end.

 

My 6700 XT is currently drawing 29W with 3 monitors active.

But I'm running Linux, of course.
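For anyone who wants to check their own idle numbers on Linux, the amdgpu driver exposes the board power sensor through the standard hwmon sysfs interface, which reports microwatts. A minimal sketch (the card name and hwmon path vary per system, so treat these paths as assumptions):

```python
from pathlib import Path
from typing import Optional

def microwatts_to_watts(raw: str) -> float:
    """hwmon power sensors report plain integers in microwatts."""
    return int(raw.strip()) / 1_000_000

def read_gpu_power(card: str = "card0") -> Optional[float]:
    """Return current board power draw in watts, or None if no sensor is found."""
    hwmon_dir = Path(f"/sys/class/drm/{card}/device/hwmon")
    for sensor in hwmon_dir.glob("hwmon*/power1_average"):
        return microwatts_to_watts(sensor.read_text())
    return None
```

Some cards expose `power1_input` instead of `power1_average`, so check what your hwmon directory actually contains.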
 

GPU power draw stops being a concern as soon as I confirm my PSU can supply it.

This is like complaining that your 900hp twin-turbo Vette's fuel economy is bad in town.
 

Mine uses 60W TBP with 3 monitors at different resolutions and refresh rates, which is supposed to be about the worst-case scenario.
 
More like the Vette gets 5 mpg in town and 10 mpg on the highway, while the 900hp TT Viper gets 5 mpg in town and 30 mpg on the highway.

Other companies have figured it out. AMD can't. Why?
 
It depends on the card's memory power usage, which is tied to the memory clocks. AMD cards keep the memory clocks high when driving multiple monitors (of different resolutions and refresh rates) and never drop into the low-power state.
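You can see this directly: amdgpu exposes the memory clock levels through `pp_dpm_mclk` in sysfs, with the currently active level starred. A small parser for that format (the file path and exact formatting are assumptions based on common amdgpu output; check your own system):

```python
def active_mclk_mhz(pp_dpm_mclk: str) -> int:
    """Parse amdgpu's pp_dpm_mclk output, e.g.:

        0: 96Mhz
        1: 456Mhz
        2: 1000Mhz *

    The active level is marked with an asterisk; return its clock in MHz.
    """
    for line in pp_dpm_mclk.splitlines():
        if line.strip().endswith("*"):
            # "2: 1000Mhz *" -> "1000Mhz *" -> "1000Mhz" -> 1000
            return int(line.split(":")[1].strip().rstrip("* ").rstrip("Mhz"))
    raise ValueError("no active level marked")

# On a real system you would feed it the sysfs file, e.g.:
# active_mclk_mhz(open("/sys/class/drm/card0/device/pp_dpm_mclk").read())
```

If the active level is pinned at the top entry while the desktop is idle, you're seeing exactly the multi-monitor behavior described above.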
 
6800 XT using 25W with two 24" 1080p monitors and a 27" 1440p/170Hz... so yeah, not sure where the 90-100W comes from...
 
Fair, but if I'm buying a turbo Vette over a Viper, the last criterion I care about is fuel economy. It's irrelevant to my purchasing decision.
 
My whole system uses 85W at idle.
I'm not sure where the 90-100W comes from either; I've never experienced it.
Just because some people are having that issue doesn't mean everyone is, and we're evidence of that. But of course that doesn't stop some people from making blanket statements.
 