AMD announces Ryzen 7000 Zen 4 CPUs

Ah, so you run it in portrait, letterboxed?

Yup, although technically this is pillarboxing. You get an effective 2160x1620 resolution. Emulators like MAME have -ror and -rol switches that rotate the screen right or left. You also have the side benefit of essentially getting actual vertical shmups etc. at their "native" rotation for free. It's a pretty solid solution. Interesting trivia: if you use the built-in Windows desktop rotation feature, it actually doesn't work with G-Sync. Good job, Nvidia/Microsoft. You have to rely on the software's own rotation features.
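For example, running a normally horizontal game upright on the rotated panel is just something like this (the game name is only a placeholder; -ror rotates MAME's output 90° clockwise, -rol 90° counterclockwise, so you pick whichever matches the way the panel is physically turned):

mame sf2 -ror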

[screenshot]


Actual native 3:4 vertical game:

[screenshot]
 
I'd buy it.

In just about everything and anything. I'll pay more money for quality/reliability/sturdiness, but I will always opt for the product with the fewest "features," as they always cause problems.

There ought to be enough of us that there is at least a market for these products.

But it is true. In general, the consumer is the worst enemy of good product design. The consumer/customer is the lowest common denominator.
Back in the day, the Abit boards were known for this. They were light on features, but fast and pretty reliable.
 
Interesting trivia: if you use the built-in Windows desktop rotation feature, it actually doesn't work with G-Sync. Good job, Nvidia/Microsoft. You have to rely on the software's own rotation features.

I had no idea about this. I often see people talk about turning BGR screens upside down and rotating the desktop in order to get RGB ClearType, but if that breaks VRR, that's a non-starter.
 
This illustrates it very well. See how the width of a rotated 42" 16:9 monitor is almost the same as the width of a non-rotated 25" 4:3 CRT: the short side of a 42" 16:9 panel works out to roughly 20.6", and a 25" 4:3 tube is about 20" wide.

http://www.displaywars.com/42-inch-d{9x16}-vs-25-inch-4x3
My 32" LCD used vertically, approximates a 4:3 20" CRT width in my cabinet. Works very well as it also doubles as a pinball machine with marquee/backboard on the second horizontal monitor. So much of the Golden Age games run vertical and IMO look pathetic on a horizontal 16:9.
 
Welcome to why I've had a GeForce 710 for years. One GPU I can move around as necessary, but it generally hangs out in the box, not drawing idle power and not wasting transistors on my CPU. The extra die space on the CPU could, if nothing else, be better used for cache.
Are they really missing anything here though? They aren't making it an APU; it sounded like a very basic iGPU, and I think it is part of the I/O die, so are you really missing anything here considering the I/O feature set?

I guess we can agree to disagree.
 
Welcome to why I've had a GeForce 710 for years. One GPU I can move around as necessary, but it generally hangs out in the box, not drawing idle power and not wasting transistors on my CPU. The extra die space on the CPU could, if nothing else, be better used for cache.
It's on the IO die, completely separate from the CPU; its presence doesn't detract at all from the processor. Depending on how AMD leverages it, it could be a benefit: there are a number of IO operations that could benefit from GPU acceleration, which it could handle independently and ahead of the CPU cores.

I just hope their new IO die is more than just a node shrink and a GPU addition. The algorithms AMD has been using for memory access and deadlock detection are lacking, and I would love to see some improvements there.
 
Are they really missing anything here though? They aren't making it an APU; it sounded like a very basic iGPU, and I think it is part of the I/O die, so are you really missing anything here considering the I/O feature set?

I guess we can agree to disagree.
It's a feature I'll literally never use, and I'll disable it as far as possible to save power. I'm simply going to pair a high-end processor with a high-end video card. Hopefully this new iGPU is on its own power plane so it can be completely powered down.

Given that my best possible case scenario is that it can be completely disabled, removal - or nearly any other use of the transistors - would be better. Put some DRAM on the IO die or something if nothing else.

I’m glad some people will get some use out of it “just in case” or something, but it’s just an utter waste to me.
 
Again, them not including it doesn't mean you would get more CPU functions. Having an iGPU doesn't hurt anyone in any way. I normally don't have a spare GPU, and an iGPU is always helpful for troubleshooting.
 
Again, them not including it doesn't mean you would get more CPU functions. Having an iGPU doesn't hurt anyone in any way. I normally don't have a spare GPU, and an iGPU is always helpful for troubleshooting.
I would say it's detrimental if it uses any power when I have a discrete card installed.

Again:
It's a feature I'll literally never use. It's dead transistor space *at best*. At worst, it's dead transistor space that uses power and increases heat. Nearly any other use for that transistor count would be better.
 
I would say it's detrimental if it uses any power when I have a discrete card installed.

Again:
It's a feature I'll literally never use. It's dead transistor space *at best*. At worst, it's dead transistor space that uses power and increases heat. Nearly any other use for that transistor count would be better.
Unless the BIOS/UEFI and/or APU architecture are crap, the iGPU should be able to be fully disabled in BIOS/UEFI.
The HD 8400E iGPU is disabled in the thin client in my sig and uses no power or resources, and that is an SoC that is nearing a decade old at this point, so hopefully designs from 2022 will have that feature enabled.

I do agree that it is a waste of space, but these designs aren't going to change mainstream models for a tiny percentage of customers who want that specific feature removed at the hardware level.
There might be a top-tier model that will be released later on without it, but that will be $$$$ and a limited design at best.
 
This will be the first Ryzen CPU with an iGPU; they obviously decided it was a good decision to include it. I'm sure it will use less power than your RGB bling or a single fan, which nobody seems to concern themselves with regarding power consumption.
 
I would say it's detrimental if it uses any power when I have a discrete card installed.

Again:
It's a feature I'll literally never use. It's dead transistor space *at best*. At worst, it's dead transistor space that uses power and increases heat. Nearly any other use for that transistor count would be better.
The new IO die and iGPU likely use less power and generate less heat than the previous IO controller alone.

I'm personally hoping they leave it on even when a discrete GPU is installed; it would be nice to have Discord and the like running from it on a cheap 22", while the gaming is dedicated to the 34", keeping them completely independent.
 
Something just occurred to me here for the eventual APU releases.

(Warning it’s late and this may be stupid)

When they want to release a series with beefed-up graphics, do you think they will just change out the IO die?

I have to imagine it's easier to build a GPU and then add on the IO logic than it is to design a CPU and then also cram a GPU up inside it.

Just a random 1am thought.
 
Something just occurred to me here for the eventual APU releases.

(Warning it’s late and this may be stupid)

When they want to release a series with beefed-up graphics, do you think they will just change out the IO die?

I have to imagine it's easier to build a GPU and then add on the IO logic than it is to design a CPU and then also cram a GPU up inside it.

Just a random 1am thought.
I always assumed they'd do a GPU chiplet and have it communicate with the CPU through the IO die. The GPU portion of things tends to be pretty big; not sure if putting it in the IO die would yield well enough.
 
You know what? iGPU? Bring it. As a custom watercooler, swapping in a different GPU or even just plugging in a spare GPU to test something is a massive PITA at times. There are so many times I've wished my 5600X had a simple iGPU just for video out for testing.

If it adds no cost and doesn't take away any features then yes, please.
 
Some people just like to bitch just to bitch. Really fussing about a tenth of a watt? It has always been possible to disable iGPUs in the BIOS. Again, they wouldn't have used the space the iGPU occupies to add more stuff like cache for the CPU.
 
I've wished for an iGPU several times when having issues. I'm glad AMD are including it.

Yeah, a basic integrated unit would be super nice. It doesn't need to be fancy, just enough to run Windows and YouTube.
 
On a 170-230 W part?
Yes. I power limit my 5950s via Eco Mode, power limit my video cards via nvidia-smi, and disable as much of the junk that comes with motherboards as possible. Points per watt is the key performance measurement when you're running distributed computing 24x7x365.
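For anyone curious, the nvidia-smi side is just the stock power limit controls; on Linux it's something like this (the wattage is purely an example, not my actual setting):

nvidia-smi -i 0 -pm 1     # persistence mode, so the limit sticks
nvidia-smi -i 0 -pl 250   # cap board power at 250 W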
 
I get it, but if 0.1 watts really matters, I'm not sure x86 is the right place to be doing your computing.
I mean, that link WAS to power-tweaking Raspberry Pis, and it was doing the same basic thing that I'm talking about here - getting rid of the junk. Disable as much of the iGPU as possible. Turn off the LEDs. Reduce networking to 100 Mbit. Etc.
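On the Linux boxes the 100 Mbit part is a one-liner with ethtool, roughly like this (the interface name will vary by system):

sudo ethtool -s eth0 speed 100 duplex full autoneg off   # force the link down to 100 Mbit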

The issue with the Raspberry Pis (as well as most consumer ARM hardware) is that they use 28nm for making those chips. ARM is pretty awesome at efficiency, but a well-tweaked 5950 is more efficient at points per watt than a Raspberry Pi is. If I could buy a few of these (https://d1o0i0v5q5lp8h.cloudfront.n...uments/Altra_Max_Rev_A1_DS_v1.00_20220331.pdf) like I can a 5950, I would likely be running ARM.

Edit:
Now imagine you're doing this over and over again - disabling iGPUs, disabling onboard sound, disabling disco lights, disabling USB, disabling SATA, etc, etc, etc. You might start to get a little bit bitter at all the literal crap the manufacturers include that would be better left as expansion cards. If the transistor count can't be used for something useful, I'd prefer they just left it out. That simple GPU in the new Ryzens is certainly a multi-million-transistor addition - it would be so much better as something generally useful, even if it was some DRAM for the IO. If it can't be something useful, just leave it off - an iGPU is NOT useful on a high-end CPU.
 
I mean, that link WAS to power-tweaking Raspberry Pis, and it was doing the same basic thing that I'm talking about here - getting rid of the junk. Disable as much of the iGPU as possible. Turn off the LEDs. Reduce networking to 100 Mbit. Etc.

The issue with the Raspberry Pis (as well as most consumer ARM hardware) is that they use 28nm for making those chips. ARM is pretty awesome at efficiency, but a well-tweaked 5950 is more efficient at points per watt than a Raspberry Pi is. If I could buy a few of these (https://d1o0i0v5q5lp8h.cloudfront.n...uments/Altra_Max_Rev_A1_DS_v1.00_20220331.pdf) like I can a 5950, I would likely be running ARM.

Edit:
Now imagine you're doing this over and over again - disabling iGPUs, disabling onboard sound, disabling disco lights, disabling USB, disabling SATA, etc, etc, etc. You might start to get a little bit bitter at all the literal crap the manufacturers include that would be better left as expansion cards. If the transistor count can't be used for something useful, I'd prefer they just left it out. That simple GPU in the new Ryzens is certainly a multi-million-transistor addition - it would be so much better as something generally useful, even if it was some DRAM for the IO. If it can't be something useful, just leave it off - an iGPU is NOT useful on a high-end CPU.
It is useful. It's just not useful to you.

Also, why would I disable SATA, USB, and sound? Those should all be expansion cards? What is this, the '90s and '00s still?
 
I always assumed they'd do a GPU chiplet and have it communicate with the CPU through the IO die. The GPU portion of things tends to be pretty big; not sure if putting it in the IO die would yield well enough.
But the IO die is now pretty small and on the same process as the 6x50-series GPUs. So I just thought, well, I mean, it's there; just slap the IO on a GPU and call it done.
 
Now imagine you're doing this over and over again - disabling iGPUs, disabling onboard sound, disabling disco lights, disabling USB, disabling SATA, etc, etc, etc. You might start to get a little bit bitter at all the literal crap the manufacturers include that would be better left as expansion cards
One can imagine that, but one would have a hard time imagining doing it on a desktop PC; how many kWh a year are we talking?

A Raspberry Pi/Arduino running on a 5 V battery somewhere is completely different.
 
I mean, that link WAS to power-tweaking Raspberry Pis, and it was doing the same basic thing that I'm talking about here - getting rid of the junk. Disable as much of the iGPU as possible. Turn off the LEDs. Reduce networking to 100 Mbit. Etc.

The issue with the Raspberry Pis (as well as most consumer ARM hardware) is that they use 28nm for making those chips. ARM is pretty awesome at efficiency, but a well-tweaked 5950 is more efficient at points per watt than a Raspberry Pi is. If I could buy a few of these (https://d1o0i0v5q5lp8h.cloudfront.n...uments/Altra_Max_Rev_A1_DS_v1.00_20220331.pdf) like I can a 5950, I would likely be running ARM.

Edit:
Now imagine you're doing this over and over again - disabling iGPUs, disabling onboard sound, disabling disco lights, disabling USB, disabling SATA, etc, etc, etc. You might start to get a little bit bitter at all the literal crap the manufacturers include that would be better left as expansion cards. If the transistor count can't be used for something useful, I'd prefer they just left it out. That simple GPU in the new Ryzens is certainly a multi-million-transistor addition - it would be so much better as something generally useful, even if it was some DRAM for the IO. If it can't be something useful, just leave it off - an iGPU is NOT useful on a high-end CPU.
There are a lot of compute tasks where a GPU will be more efficient than a CPU. In a compute environment such as that, why not leverage them and actually save on overall consumption?
 
It is useful. It's just not useful to you.

Also, why would I disable SATA, USB, and sound? Those should all be expansion cards? What is this, the '90s and '00s still?
I haven't used SATA in most of my systems for years now - the only system that has SATA drives is my FreeNAS. I put exclusively NVMe in everything else, so SATA gets disabled and has become dead transistors. It wouldn't bother me in the slightest to have to buy a SATA controller for my NAS, or buy a specific NAS MB. Wanting SATA in every MB at this point would be similar to wanting a built-in IDE controller on your MB…

Sound-wise, I have never used integrated sound in any system I've had - and that goes back to my Pentium IIs. Back then I used Sound Blasters; now I just use my headset, which has it integrated already. Integrated sound is, and has always been, dead transistors.

As for USB, I disable it in all my Pis and any headless system as I do management remotely anyway. In my box that I also use for gaming, I only need 2 USB ports. When the MB allows it, I’ll disable additional controllers.
 
There are a lot of compute tasks where a GPU will be more efficient than a CPU. In a compute environment such as that, why not leverage them and actually save on overall consumption?
I already have an X080 card in my system anyway, and I fully support offloading to those when it makes sense. For example, I run Folding@home on my GPU because it's way more efficient than running it on the CPU.
 
I haven't used SATA in most of my systems for years now - the only system that has SATA drives is my FreeNAS. I put exclusively NVMe in everything else, so SATA gets disabled and has become dead transistors. It wouldn't bother me in the slightest to have to buy a SATA controller for my NAS, or buy a specific NAS MB. Wanting SATA in every MB at this point would be similar to wanting a built-in IDE controller on your MB…

Sound-wise, I have never used integrated sound in any system I've had - and that goes back to my Pentium IIs. Back then I used Sound Blasters; now I just use my headset, which has it integrated already. Integrated sound is, and has always been, dead transistors.

As for USB, I disable it in all my Pis and any headless system as I do management remotely anyway. In my box that I also use for gaming, I only need 2 USB ports. When the MB allows it, I’ll disable additional controllers.
OK, well, I feel like you are an edge case and don't speak for the majority here. Not sure why anybody would otherwise give two shits about a 10th of a watt for their gaming PC. Not saying you're wrong or anything, just saying I guess we'll agree to disagree on what's useful or not.
 
I haven't used SATA in most of my systems for years now - the only system that has SATA drives is my FreeNAS. I put exclusively NVMe in everything else, so SATA gets disabled and has become dead transistors. It wouldn't bother me in the slightest to have to buy a SATA controller for my NAS, or buy a specific NAS MB. Wanting SATA in every MB at this point would be similar to wanting a built-in IDE controller on your MB…

Sound-wise, I have never used integrated sound in any system I've had - and that goes back to my Pentium IIs. Back then I used Sound Blasters; now I just use my headset, which has it integrated already. Integrated sound is, and has always been, dead transistors.

As for USB, I disable it in all my Pis and any headless system as I do management remotely anyway. In my box that I also use for gaming, I only need 2 USB ports. When the MB allows it, I’ll disable additional controllers.
All of that is effectively built into the CPU - even if you don't want it, it's on the CPU die, because a lot of folks ~do~ - and the majority of users aren't the folks on [H] either way.
I'm the opposite edge case - systems might start with only NVMe, but a LOT of mine (in use today) end up with piles of drives, extra ports, extra NICs, etc. added into them. I want the features on the board - it might not get used today, but it may get used tomorrow. And I want expansion on TOP of that.
OK, well, I feel like you are an edge case and don't speak for the majority here. Not sure why anybody would otherwise give two shits about a 10th of a watt for their gaming PC. Not saying you're wrong or anything, just saying I guess we'll agree to disagree on what's useful or not.
This. Folding/WCG is only a niche use case still (and I was a member of the team here for a while). He has an edge case - I have the opposite edge case.
 
One other thing: with the onboard GPU disabled, it will act as extra heat absorption for the IO die. So it can be useful even while deactivated.
 