AMD announces Ryzen 7000 Zen 4 CPUs

I get it, but what I am most interested in is what the system will do with those onboard GPUs when a discrete one is added. Will it "crossfire"? There are plenty of things it could offload that aren't graphics related but still run on the GPU. It could handle texture decompression, physics, probably other things too. It would decrease both CPU and GPU load and squeeze out a few extra frames, if they actually support it at a driver level.
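(A minimal sketch of what that kind of offload can look like from user space today, assuming pyopencl and an OpenCL runtime that exposes the iGPU are installed; the device pick and the toy "physics" kernel are purely illustrative, not how a driver-level solution would actually work.)

```python
# Sketch: dispatching non-graphics work (a toy physics step) to a
# secondary/integrated GPU via OpenCL. Assumes pyopencl is installed
# and the iGPU has an OpenCL runtime; nothing here is vendor-specific.
import numpy as np
import pyopencl as cl

# Enumerate every GPU the runtime can see; an iGPU shows up as just
# another device alongside the discrete card.
gpus = [d for p in cl.get_platforms()
        for d in p.get_devices(device_type=cl.device_type.GPU)]
for d in gpus:
    print(d.name)

igpu = gpus[-1]  # illustrative: pick whichever device is the iGPU on your system
ctx = cl.Context(devices=[igpu])
queue = cl.CommandQueue(ctx)

kernel = """
__kernel void integrate(__global float *pos, __global const float *vel,
                        const float dt) {
    int i = get_global_id(0);
    pos[i] += vel[i] * dt;   // toy physics step offloaded from the CPU
}
"""
prog = cl.Program(ctx, kernel).build()

n = 1 << 20
pos = np.zeros(n, dtype=np.float32)
vel = np.ones(n, dtype=np.float32)
mf = cl.mem_flags
pos_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=pos)
vel_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=vel)

prog.integrate(queue, (n,), None, pos_buf, vel_buf, np.float32(1.0 / 60.0))
cl.enqueue_copy(queue, pos, pos_buf)
print(pos[:4])  # each position advanced by one 60 fps timestep
```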

+1 on this. I've always thought they should try to exploit the onboard GPU as much as possible, although I don't know what kind of technical issues come along with that. It's unfortunate having compute units just sitting there and doing nothing, though.
 
Yeah, but if we go by Intel as a guide and AMD follows along their pricing formula, the difference in price between a CPU with a few onboard GPU cores and one that has zero is around $10, which to me is worth it for a) diagnostic purposes, and b) giving to other members of my family when I upgrade who don't need a GPU, which is what typically happens to my older systems. In the grand scheme of things, $10 isn't going to move the needle on anyone's budget, and if it does then you're likely not in the market for a $1500-$2000 gaming or workstation PC anyway.

Well, the CPU is not the only area where there is feature bloat. With RGB LEDs added to absolutely everything, and any feature and chip you could think of being added to motherboards, eventually, with $10 for a feature here and $10 for a feature there, it adds up to real money.

That, and every single feature, every single trace needlessly added to a board adds an opportunity for a defect. Something that will either cause compatibility issues, or leakage current, or something else, you name it.

It doesn't matter what you are designing, be it a computer component, a car or a happy meal, minimalism is ALWAYS the best solution.

If there is a feature that isn't strictly necessary, cut it. It will only add cost and cause problems.

As the first thing they teach you in engineering school goes: KISS. Keep It Simple, Stupid.

Make absolutely everything minimalist.
 
Make absolutely everything minimalist.
That is not what sells.
 
If there is a feature that isn't strictly necessary, cut it. It will only add cost and cause problems.
With how popular onboard Ethernet, onboard sound, extra fan connectors, and USB have been, I do not see that happening. Maybe systems exist that have nothing onboard and let people plug in PCI adapters for everything they really need, but they didn't sell.

I am with the 99% of people: give me USB, sound, Ethernet, and fan headers even if I do not use them, given how small the price difference is relative to the RAM-CPU-video card cost of the system.

Some traces are quite simple, and it does not matter if they stop working (the RGB stuff / fan stuff).
 
That is not what sells.

I'd buy it.

In just about everything and anything. I'll pay more money for quality/reliability/sturdiness, but I will always opt for the product with the least number of "features", as they always cause problems.

There ought to be enough of us that there is at least a market for these products.

But it is true. In general the consumer is the worst enemy of good product design. The consumer/customer is the lowest common denominator.
 
Good god why? If I wanted integrated video, I’d buy an APU. Take those same transistors and, if nothing else, make them cache. I hate buying junk I have to turn off, and an iGPU is junk I have to turn off.
iGPUs are incredibly useful to have for troubleshooting, or sometimes you just need a basic video out without a card. It's fine.
 
But it is true. The consumer is the worst enemy of good product design.
Do you think CPUs/motherboards coming with Ethernet/USB/sound/fan headers/etc., a list of non-essential features not everyone uses, is really bad product design?

I much prefer having them
 
Do you think CPUs/motherboards coming with Ethernet/USB/sound/fan headers/etc., a list of non-essential features not everyone uses, is really bad product design?

I much prefer having them
Maybe we can have the fans hard wired to the board so they aren’t user replaceable. Or have the USB ports as part of the motherboard?
 
Do you think CPUs/motherboards coming with Ethernet/USB/sound/fan headers/etc., a list of non-essential features not everyone uses, is really bad product design?

I much prefer having them

The only ones of those I'd keep are the fan connectors, because fan speed control is tied to chip temperatures, which works better integrated.

Everything else I'd rather have on a separate add-in board.

Right now on my desktop board I have an overly complicated dual-chip Realtek sound solution which I don't use, because it is stupid, and I have a USB DAC anyway. I have dual gigabit Ethernet ports which I do not use, for two reasons. Firstly, because I don't trust Realtek crap. Realtek can't make a good Ethernet solution to save their lives. There is only one good Ethernet solution, and that is Intel. (Well, maybe Broadcom too.)

Even Aquantia, which is so popular these days, is totally subpar when compared to Intel NICs.

[attached chart: Aquantia vs. Intel NIC comparison]


And secondly, because I have an add-on board with 10-gig SFP+/fiber networking, which I use to run two fiber lines: one to my switch, and the other a direct link to my NAS box.

While in theory I would prefer to have the USB ports as a separate add-on, they are kind of integrated into the chipset these days, so those I could live with, but with everything else, I like choosing the solution that best suits my needs, not just living with whatever the motherboard maker decided was a cool feature to integrate.

I'd love to go back to something like the 286 days, where you just had a ton of slots and nothing on board.

[attached photo: 286-era motherboard with expansion slots]


You want a video card? You add a video card. You want a hard drive? Stick a hard drive controller into a slot. You want a floppy drive? Stick a floppy controller into one of those slots. You want an FPU? Stick one in the FPU socket. You want sound? Stick a sound card into a slot. You want USB? Stick a USB controller into a slot.

This way I get to choose what is best for ME and not just use some pre-fab crap chosen by a motherboard vendor because it meets the needs of the lowest common denominator who doesn't know shit about computers and just wants a "gaming pc".

Details matter, and I want to choose each nitty gritty detail for my build, including which Ethernet chip my system has, which USB controller chip it has, whether or not I want any SATA ports. The more of these decisions that have been made for me by a product, to me, the worse that product is.

The PC is the Personal Computer. Every last aspect of it is supposed to be personalized to me!
People talk about building custom PCs, but that hasn't existed for years. There is very little that is custom now. It's just a matter of selecting the prefab motherboard you want, and a GPU.
 
I hope they move the X5 SKU to 8 cores.

Both AMD and Intel have been sitting on 6/12 x5s for four generations.

i3/r3 = 6/12
i5/r5 = 8/16
i7/r7 = 8p8e or 12/24
i9/r9 = 8p16e or 16/32

That's a world I can believe in.
The i5-12600K is 6 P cores and 4 E cores. It outperforms the 11900K in every single situation.

The non-K 12th-gen i5s are all 6 P cores.

13th-gen Raptor Lake SKUs have all been leaked, and there are some non-K i5s with 6 P cores and 4 E cores.
 
Details matter, and I want to choose each nitty gritty detail for my build, including which Ethernet chip my system has, which USB controller chip it has, whether or not I want any SATA ports. The more of these decisions that have been made for me by a product, to me, the worse that product is.
Agree for the most part. I mean, I'd prefer if the chips I wanted were integrated, but since that rarely is the case (except maybe one or two), having them omitted would be my next choice.
 
Do we have confirmation on whether the built-in GPU supports HDMI 2.1?

If that's the case, it's potentially pretty useful for certain kiosk/embedded scenarios where you want a small/quiet PC and an HDMI 2.1 display output, but don't want the heat/noise of a discrete GPU.
 
Agree for the most part. I mean, I'd prefer if the chips I wanted were integrated, but since that rarely is the case (except maybe one or two), having them omitted would be my next choice.

I'm partially with you. If boards just came with everything I wanted on them, I wouldn't have as strong opinions on the matter, but I still think I'd prefer just having more expansion slots and getting to customize things.

Firstly, it allows for parts to more easily be repurposed. Maybe something was used in a PC for games at first, but down the road you want to use it as an HTPC, or a server, or something else. Just take the parts that don't make sense in the new role out, and replace them with parts that do. It's a great level of flexibility.

I also liked the customization aspect. Spending hours poring over specifications, reading about the pros and cons of different options, and then choosing the exact one that made sense for you. That was part of the charm of things. If one aspect of a build was important to you, you splurged and got the best component for it. If something else was less important, you either got a cheap part or maybe even omitted it altogether if it was unnecessary. There was a certain charm to PC building back then. Each build was truly an expression of what the user wanted and prioritized (or, on budget builds, what they could afford). Something that was truly custom.

I miss that.

When I see something like the Asus WRX80 Threadripper Pro motherboard, the possibilities with all those expansion slots make me salivate.

[attached photo: Asus WRX80 Threadripper Pro motherboard]


Take this motherboard, get rid of all the on board crap (Sound, Wifi, Ethernet, SATA, etc.) and you'd have "The Perfect Motherboard". A blank slate with tons of expansion on which you can build something that is your own.
 
When I see something like the Asus WRX80 Threadripper Pro motherboard, the possibilities with all those expansion slots make me salivate.

Not to mention it takes registered RAM.
 
Do we have confirmation on whether the built-in GPU supports HDMI 2.1?

If that's the case, it's potentially pretty useful for certain kiosk/embedded scenarios where you want a small/quiet PC and an HDMI 2.1 display output, but don't want the heat/noise of a discrete GPU.
Because it is not the first time I have seen this, I am a bit curious: what would be the use case for a small iGPU where HDMI 2.0 would not do the job? The only scenarios where I would want more than that seem to require a better GPU than this.
 
Everything else I'd rather have on a separate add-in board.
I fully get your preference and reasoning, but that is different from bad design (and many motherboards' integrated LAN is from Intel). There is a level of popularity for an I/O where having it integrated into the motherboard makes a lot of sense: SATA ports/NVMe ports/USB/Ethernet/fan headers, etc.

People can dislike it and prefer an option without them (it would be a market so niche that it could end up not significantly less expensive given the low volume), but that does not make a motherboard maker putting SATA ports on a board a bad design decision on their part.
 
Do we have confirmation on whether the built-in GPU supports HDMI 2.1?

If that's the case, it's potentially pretty useful for certain kiosk/embedded scenarios where you want a small/quiet PC and an HDMI 2.1 display output, but don't want the heat/noise of a discrete GPU.
They say it is based on RDNA 2, and that the whole lineup supports HDMI 2.1, so it would be safe to assume the iGPU output does as well. But what are you doing at a kiosk where the difference would matter?
 
They say it is based on RDNA 2, and that the whole lineup supports HDMI 2.1, so it would be safe to assume the iGPU output does as well. But what are you doing at a kiosk where the difference would matter?

Arcade cabinet PC that needs variable refresh, and thus HDMI 2.1, for arcade games with weird refresh rates. It would be AWESOME to not need a discrete GPU anymore. 2D arcade emulation is basically the perfect scenario for this: not demanding on the GPU at all. It's all CPU.



I'm planning on mounting a 42" OLED in my arcade cabinet vertically, which is about the perfect 4:3 size for my cabinet, and I was dreading having to buy some 3000 series+ discrete card just to make it work.

With this, I could probably build a whole new cabinet PC for what I would have spent on the ludicrous GPU.
 
Arcade cabinet PC that needs variable refresh, and thus HDMI 2.1, for arcade games with weird refresh rates. It would be AWESOME to not need a discrete GPU anymore.


It's too underpowered for that; these are Intel iGPU levels of performance. You're going to want to wait for the next APU release set. I'm hoping I can do this with a 7700G, whenever they get around to launching that.
 
It's too underpowered for that; these are Intel iGPU levels of performance. You're going to want to wait for the next APU release set. I'm hoping I can do this with a 7700G, whenever they get around to launching that.

I don't think so. Emulators like MAME literally don't use your GPU at all. I've run them on Intel iGPUs, and it was fine. Its Direct3D backend literally draws two triangles and that's it. All that was missing was VRR.
 
I don't think so. Emulators like MAME literally don't use your GPU at all. I've run them on Intel iGPUs, and it was fine. Its Direct3D backend literally draws two triangles and that's it. All that was missing was VRR.
I was going to say, it depends on the emulator obviously, but we ran some on 486 computers with no video card, and on a Pentium 200 MMX, back in the day.

Would a monitor/DisplayPort with FreeSync not do the trick here vs. a TV? Depends on the size, I guess.
 
I was going to say, it depends on the emulator obviously, but we ran some on 486 computers with no video card, and on a Pentium 200 MMX, back in the day.

Would a monitor/DisplayPort with FreeSync not do the trick here vs. a TV? Depends on the size, I guess.

Yeah, you might run into trouble with RPCS3 or Dolphin because they actually use your GPU, but I don't care about that crap. I only play old 2D arcades anyway.

So the problems with every FreeSync computer monitor:

1) too wide to fit in a cabinet designed for 4:3 screens horizontally
2) slim enough to fit in the cabinet but then way too small in terms of the 4:3 image you get on it
3) too small to take up enough of the cabinet if you mount it vertically

A 42" LG OLED solves all of those problems.

FreeSync support with TVs was always spotty, but HDMI 2.1 VRR is something that supposedly just works.
 
FreeSync support with TVs was always spotty, but HDMI 2.1 VRR is something that supposedly just works.
In that case, Intel would already work for you, I think. A 12100 has:
Graphics Output: eDP 1.4b, DP 1.4a, HDMI 2.1

I imagine it will get there down the line as well, if a 12100 is too hot.
 
In that case, Intel would already work for you, I think. A 12100 has:
Graphics Output: eDP 1.4b, DP 1.4a, HDMI 2.1

I imagine it will get there down the line as well, if a 12100 is too hot.

That's a good point. I wasn't thinking about Intel; I'll look into it. The thing is, it depends on the drivers. I think the vendor actually has to support VRR, since it's optional, so I don't know if Intel did it. I know that AMD's discrete GPU HDMI 2.1 does support it, though, so I don't see why they wouldn't with the built-in graphics.

Seems like Intel is at least aware of it:
https://www.intel.com/content/www/u...able-refresh-rate-vrr-and-auto-low-81928.html
 
In that case, Intel would already work for you, I think. A 12100 has:
Graphics Output: eDP 1.4b, DP 1.4a, HDMI 2.1

I imagine it will get there down the line as well, if a 12100 is too hot.

Okay, I looked into it, and the plot thickens. The other issue is that the hardware not only needs to support HDMI 2.1, it also basically needs to be able to do 4K @ 120 Hz to really satisfy the requirements of every arcade game.

Here's the funny thing. Old games like Pac-Man and Donkey Kong actually run at 60.606061 Hz, so if you run them at 60 Hz, they hitch every few seconds because they can't sync.

So basically, you need > 60 Hz to run every arcade game in MAME perfectly.

Apparently the built-in Intel graphics is limited to 4K @ 60 Hz even though it supports HDMI 2.1. The other thing is that the true analog refresh rate of a monitor is never exactly 60 Hz; there are some arcade games that run very close to it, at like 59.6 Hz, and there are certain 60 Hz monitors whose TRUE real-world refresh rate is like 59.4 Hz, so even those will hitch in MAME.
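(To put numbers on the hitching: the game and display rates beat against each other, so one frame of drift accumulates every 1/|f_game - f_display| seconds. A quick sketch of the arithmetic:)

```python
# One dropped or doubled frame accumulates every 1/|f_game - f_display|
# seconds when a fixed-rate display can't sync to the game's native refresh.
def hitch_interval_s(f_game: float, f_display: float) -> float:
    return 1.0 / abs(f_game - f_display)

print(hitch_interval_s(60.606061, 60.0))  # Pac-Man on a true 60 Hz panel: ~1.65 s
print(hitch_interval_s(59.6, 59.4))       # 59.6 Hz game on a panel that's really 59.4 Hz: ~5 s
```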

The question is whether the built-in graphics in the new AMD CPUs will actually do 4K @ 120 Hz AND HDMI 2.1 VRR.
 
I was going to say, it depends on the emulator obviously, but we ran some on 486 computers with no video card, and on a Pentium 200 MMX, back in the day.

Would a monitor/DisplayPort with FreeSync not do the trick here vs. a TV? Depends on the size, I guess.
Yup, I ran ZSNES on a Pentium 200 MMX-based laptop.
 
I don't think so. Emulators like MAME literally don't use your GPU at all. I've run them on Intel iGPUs, and it was fine. Its Direct3D backend literally draws two triangles and that's it. All that was missing was VRR.
If your only concern is old MAME titles, then yeah, this is fine. But if you want to go bigger... I've got an old i5 with a 970 that I was hemming and hawing over replacing with a 5700G. But the lack of VRR had me thinking of just snagging a second-hand laptop with a 2060 and a monitor that is G-Sync compatible. But I have hopes for the 7700G.
 
If your only concern is old MAME titles, then yeah, this is fine. But if you want to go bigger... I've got an old i5 with a 970 that I was hemming and hawing over replacing with a 5700G. But the lack of VRR had me thinking of just snagging a second-hand laptop with a 2060 and a monitor that is G-Sync compatible. But I have hopes for the 7700G.

So for giggles, I did a query on all the games in MAME that run above 60 Hz, and there are 824 of them. So there are 824 games that will hitch in MAME, even with a VRR monitor, if it doesn't run above 60 Hz.

[attached screenshot: MAME query results]
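(For anyone who wants to reproduce the count: MAME's -listxml output tags each machine's <display> element with a refresh attribute, so a streaming parse along these lines should get you in the ballpark. This is a sketch under assumptions, not the exact query above; results will vary by MAME version and by whether clones are excluded.)

```python
# Count MAME machines with a display refresh above 60 Hz by streaming
# the XML from `mame -listxml` (the full listing is hundreds of MB).
# Note: clones are counted as separate machines here.
import subprocess
import xml.etree.ElementTree as ET

proc = subprocess.Popen(["mame", "-listxml"], stdout=subprocess.PIPE)

count = 0
for _, elem in ET.iterparse(proc.stdout, events=("end",)):
    if elem.tag == "machine":
        if any(float(d.get("refresh", "0")) > 60.0 for d in elem.findall("display")):
            count += 1
        elem.clear()  # release memory as we stream

print(count)
```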
 
FreeSync support with TVs was always spotty, but HDMI 2.1 VRR is something that supposedly just works.
It doesn't; if it did, VESA wouldn't have had to come out with the new badges. I'm hopeful for the new VESA-certified entertainment screens, but I've been primarily looking at G-Sync screens for that reason.
 
Even Aquantia, which is so popular these days, is totally subpar when compared to Intel NICs.

[attached chart: Aquantia vs. Intel NIC comparison]
Love my Aquantia NICs. Cheaper new than most used Intel cards, lower power too. Not sure where your chart comes from, but most folks seem to say they are very close performance-wise:

https://mightygadget.co.uk/startech-intel-x550-t2-dual-port-10-gigabit-ethernet-card-review/
 
iGPUs are incredibly useful to have for troubleshooting, or sometimes you just need a basic video out without a card. It's fine.
Welcome to why I’ve had a GeForce 710 for years. One GPU I can move around as necessary but generally hangs out in the box not drawing idle power and not wasting transistors on my CPU. The extra die space on the CPU could, if nothing else, be better used for cache.
 
It doesn't matter what you are designing, be it a computer component, a car or a happy meal, minimalism is ALWAYS the best solution.

Make absolutely everything minimalist.
What you are describing is enterprise, not consumer or even prosumer.
A plain green PCB isn't going to sell as well as an RGB glowing PCB will to the average consumer, yet in enterprise an RGB glowing PCB would be laughed out of the building, and rightfully so.

Marketing is half the battle, and especially with who the product is being marketed for.
Not every customer is you, and projecting your specific needs onto every product is unnecessary to get your point across.
 
I'm planning on mounting a 42" OLED in my arcade cabinet vertically, which is about the perfect 4:3 size for my cabinet, and I was dreading having to buy some 3000 series+ discrete card just to make it work.

I did this with my lowly 4830 and now my 7870. Are you inventing problems here?
I did mine with LCDs though. Sorry if I am way off.
 
I did this with my lowly 4830 and now my 7870. Are you inventing problems here?
I did mine with LCDs though. Sorry if I am way off.

I don't get what you mean. Inventing what problems? What did you do with your 4830 and 7870?
 
Welcome to why I’ve had a GeForce 710 for years. One GPU I can move around as necessary but generally hangs out in the box not drawing idle power and not wasting transistors on my CPU. The extra die space on the CPU could, if nothing else, be better used for cache.
Yeah, but the extra space would not be used for anything. They are not cutting down the CPU to fit a GPU in there.
 
+1 on this. I've always thought they should try to exploit the onboard GPU as much as possible, although I don't know what kind of technical issues come along with that. It's unfortunate having compute units just sitting there and doing nothing, though.
They've tried - can't remember the name of the tech, but it's painful to coordinate and doesn't work all that well.
With how popular onboard Ethernet, onboard sound, extra fan connectors, and USB have been, I do not see that happening. Maybe systems exist that have nothing onboard and let people plug in PCI adapters for everything they really need, but they didn't sell.

I am with the 99% of people: give me USB, sound, Ethernet, and fan headers even if I do not use them, given how small the price difference is relative to the RAM-CPU-video card cost of the system.

Some traces are quite simple, and it does not matter if they stop working (the RGB stuff / fan stuff).
All of that is built into the CPU now; the PHY exists just to provide the physical connector on the board, but it's all built into the SoC that is the CPU.
The only ones of those I'd keep are the fan connectors, because fan speed control is tied to chip temperatures, which works better integrated.

Everything else I'd rather have on a separate add-in board.

Right now on my desktop board I have an overly complicated dual-chip Realtek sound solution which I don't use, because it is stupid, and I have a USB DAC anyway. I have dual gigabit Ethernet ports which I do not use, for two reasons. Firstly, because I don't trust Realtek crap. Realtek can't make a good Ethernet solution to save their lives. There is only one good Ethernet solution, and that is Intel. (Well, maybe Broadcom too.)
And the sound system is buggy as shit on the sTRX4 boards (the first gen of the internally USB-connected sound systems). I've got the same board, and the Linux guys are STILL struggling to get it working right.
Intel is solid, Mellanox is solid, Broadcom is hit or miss depending on the card (25G is not good, 10G is generally good now). Realtek? LOL.
Even Aquantia, which is so popular these days, is totally subpar when compared to Intel NICs.

[attached chart: Aquantia vs. Intel NIC comparison]
Curious where you got that from (I've had my own issues with those cards; have 4 of them).
You want a video card? You add a video card. You want a hard drive? Stick a hard drive controller into a slot. You want a floppy drive? Stick a floppy controller into one of those slots. You want an FPU? Stick one in the FPU socket. You want sound? Stick a sound card into a slot. You want USB? Stick a USB controller into a slot.
Sadly, all of that is built into the CPU now. Like, ALL of it. Minus the floppy controller.
Details matter, and I want to choose each nitty gritty detail for my build, including which Ethernet chip my system has, which USB controller chip it has, whether or not I want any SATA ports. The more of these decisions that have been made for me by a product, to me, the worse that product is.
This is why I buy HEDT - although you're still stuck with a bunch of that being built into the CPU.
 
So the problems with every FreeSync computer monitor:

1) too wide to fit in a cabinet designed for 4:3 screens horizontally
2) slim enough to fit in the cabinet but then way too small in terms of the 4:3 image you get on it
3) too small to take up enough of the cabinet if you mount it vertically

A 42" LG OLED solves all of those problems.

How does an LG OLED solve the 4:3 sizing problem?
 
How does an LG OLED solve the 4:3 sizing problem?
The LG OLED displays support 144 Hz and VRR, so you can get more accurate frame rates for older arcade titles that were all over the place FPS-wise; some actually run as low as 24 FPS and others as high as 75, so you need a screen capable of filling in everything in between.
A 42" display, when rotated vertically, ends up with a screen width of 20.6", and most arcade cabinets used a 19" or 21" 4:3 display, so it's a decent middle ground if accuracy is what you are aiming for.
The only downside with OLED displays, from what I have seen so far, is that if you use a filter to emulate scanlines, it creates uneven wear in OLED panels that can cause burn-in and significantly shorten their lifespans.
 