$119 HDMI Cable Has a Built-In Anti-Aliasing Chip to Remove Jaggies

Megalith

24-bit/48kHz
Staff member
Joined
Aug 20, 2006
Messages
13,000
Would you spend north of $100 on an HDMI cable that promises better visuals? Marseille's mCable Gaming Edition has an embedded video signal chip that offers contextual anti-aliasing. PC Perspective decided to test one and found that it does work as advertised, without adding input lag.

Initially, an HDMI cable that claims to improve image quality while gaming sounds like the snake oil that "audiophile" companies like AudioQuest have been peddling for years. However, looking into some of the more technical details offered by Marseille, their claims seem increasingly plausible. By using a signal processor embedded inside the HDMI connector itself, Marseille appears to be manipulating the video signal to improve quality in ways applicable to gaming.
 
Don't know. I consider myself a bit of a visual purist and prefer to see the image as it's given - blemishes and all.
 
It makes the argument that consoles are cheaper weaker if you have to buy a hardware AA device, when I can add SMAA to any PC game via shader injection for free.
 
Pretty cool cable... the only use I can see is, like he said, using it for older game consoles (Xbox 360, PS3). Can't really see a use for it in PC gaming.
 
It makes the argument that consoles are cheaper weaker if you have to buy a hardware AA device, when I can add SMAA to any PC game via shader injection for free.
Until it gets you banned in said game. And that does happen. Not to say I don't somewhat agree with you; I just thought I would point out that injectors get caught by cheat detection very frequently these days, since the method is also used to cheat, so they are typically not smiled upon.
 
This was a good review. I would wait for the technology to get a little better, and the price to come down significantly....like around $25.
 
Pretty cool idea.

Would be sweet if it could do more things like tone mapping, color correction, saturation boosts, bloom, etc. - basically many of the things that post-processors already do.

It could link to a Bluetooth app so you could change settings from your phone.

Really only useful for consoles; PCs can already do all of this and better.
 
Until it gets you banned in said game. And that does happen. Not to say I don't somewhat agree with you; I just thought I would point out that injectors get caught by cheat detection very frequently these days, since the method is also used to cheat, so they are typically not smiled upon.

How exactly would the console know you have such a piece of hardware? EDID? I would think the device should act as a pass-through for EDID so that the television's EDID was detected.
 
How exactly would the console know you have such a piece of hardware? EDID? I would think the device should act as a pass-through for EDID so that the television's EDID was detected.

I think he was referring to the SMAA injecting method on the PC.
 
I don't get it. I've been reading this website for 15 years... and I see people spend $100-1,000 a year on a better GPU partly to reduce aliasing, but they won't spend $100?
 
I don't get it. I've been reading this website for 15 years... and I see people spend $100-1,000 a year on a better GPU partly to reduce aliasing, but they won't spend $100?

Yeah, but many of these "features" can ruin visual quality and are usually tuned way down or disabled when the TV processes the image. In a game, AA is either enabled or the game is scaled to a higher resolution so it isn't needed - both of which require processing power and memory bandwidth.

As for me, nah. I calibrated for color processing back in the day; most of the image processing is disabled for me, so this cable would be a waste - nothing to invest in.
 
Why can't we get this video stream processing chip into the display/TV itself?
 
I remember seeing ads in PC Gamer for the very first GPUs, like the Matrox Mystique, priced at an insane $190.

Now there's interconnect cables reaching towards that price point.

I don't know which seems more crazy...
 
Imagine playing Gran Turismo games on an old PS2 via this HDMI cable, or the Gran Turismo that was on the PS3, and this cable has more value to its user. Definitely not for most PC gaming purists, but it is for console gamers who use older consoles with their beloved older games.
 
And then if the cable breaks, you have to buy another.

Why not have an external box that has in/out connections?

That way, you can just buy the box and use cheapo HDMI cables.
 
Seems like it would run the risk of some serious latency, even if they claim otherwise. For $100+ for a cable, I'd need some pretty rigorous testing.
 
It makes the argument that consoles are cheaper weaker if you have to buy a hardware AA device, when I can add SMAA to any PC game via shader injection for free.

SMAA doesn't really work all that great in most games. No AA clears all jaggies. 1080p, even with the best AA, always looks a bit jaggy. 1440p notably reduces it, though, and I find some AA + 1440p clears them up fairly well. But if this cable more or less removes them completely, that is certainly a nice upgrade.

If it's true and it works as advertised, it would be nice if this and G-Sync became standard on all low-to-mid-range cables/monitors.
 
To make matters clear: it's marketed for console gaming.

I'd still be interested in how it looks on PC at 4K, with AA that comes FPS-free.

Still:
It should be just a tad cheaper; then it would be a no-brainer.
It should have an off switch in case it looks funny.
And I think there is no HDR support - for consoles/TVs that's kind of a big miss.

I remember seeing ads in PC Gamer for the very first GPUs, like the Matrox Mystique, priced at an insane $190.

Now there's interconnect cables reaching towards that price point.

I don't know which seems more crazy...

But those $190 are worth more in today's dollars, aren't they?
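As a rough check - assuming a 1996 launch for the Mystique and an average US inflation rate of about 2.2%/yr, both of which are my own ballpark assumptions:

```python
# Rough inflation check on the $190 Matrox Mystique. The 1996 launch
# year and ~2.2%/yr average inflation rate are assumptions, not facts
# from the thread.
price_1996 = 190
years = 2017 - 1996
rate = 0.022

adjusted = price_1996 * (1 + rate) ** years
print(f"~${adjusted:.0f} in 2017 dollars")
```

So the old card still cost somewhat more in real terms than the cable, but they're in the same ballpark.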
 
What I find a bit impressive is taking sub-1ms to decode HDCP, do its stuff, then re-encode 1080p @ 120fps.
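The arithmetic behind that, as a rough sketch - the sub-1ms figure is from the marketing, the rest is simple math:

```python
# Rough numbers behind the "sub-1 ms at 1080p @ 120 fps" claim.
w, h, fps = 1920, 1080, 120

frame_time_ms = 1000 / fps            # ~8.33 ms between frames
pixels_per_s = w * h * fps            # ~249 million pixels/second
latency_frames = 1.0 / frame_time_ms  # 1 ms as a fraction of a frame

print(f"frame time: {frame_time_ms:.2f} ms")
print(f"throughput: {pixels_per_s / 1e6:.0f} Mpix/s")
print(f"1 ms of latency = {latency_frames:.2f} frames")
```

A 1 ms budget is only about an eighth of a frame, which suggests (my inference, not a published spec) the chip works on a rolling window of scanlines rather than buffering whole frames.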
 
Pretty cool idea.

Would be sweet if it could do more things like tone mapping, color correction, saturation boosts, bloom, etc. - basically many of the things that post-processors already do.

It could link to a Bluetooth app so you could change settings from your phone.

Really only useful for consoles; PCs can already do all of this and better.

Daddy, is it true that when you were a kid no speaker cables had internet access?

Yes, that is true, pumpkin.

(Although, thinking about it, this would FINALLY be a reason for HDMI to have Ethernet. Who would have thought that the first device to use Ethernet over HDMI might be the HDMI cable itself???)
 
Post-processing methods of AA blur all edges on screen, which is bearable for polygons but downright awful for textures. By design, they can never be as good as the proper AA techniques. As several have pointed out already, this is useful for consoles, esp. older ones. On PC, you can force FXAA, which will give you similar, if not better results for barely a couple of frames.
 
This is my response on Linus's video.


This is not AA; it is just some unsharp-mask-like bullshit.
The aliasing is still clearly present. It is impossible for it to do any actual AA without access to the game's rendering buffers - all it can do is run edge detection and maybe low-pass filter the image as a flat 2D plane.
The fact that you have to run at a lower-than-native res just adds to the bullshit about upscaling. (Because it isn't upscaling if your TV is still doing the work, dummy; it just uses a higher sharpening setting to send your TV a sharper image. Here's a tip: just turn your TV's various sharpening methods up to 11.)
Anyone who knows what AA really looks like can easily see this is nonsense. The temporal aliasing left alone is an obvious giveaway. This does nothing more than the processing in TVs that makes the image look worse. (Yes, I have spent 6 years trying to get the best AA possible out of games on PC and have spent countless hours looking at differences in aliasing, both in motion and statically. The wool they are trying to pull over your eyes is obvious. Don't be a sheep. They are counting on you not knowing the difference to upsell you.)
PS: Unreal can easily support high res with real AA that looks leagues better. Even real AA at 480p upscaled would look better than this garbage.

The ringing all over the image gives away that they are simply filtering the image aggressively and sharpening it to compensate for the blurring. (Low-pass filtering: does said pixel area pass a threshold? Filter it, then sharpen.) And I am seeing this on a phone screen with over 400 PPI; I can only imagine how horrible it looks on a normal screen. Hahaha, now that I look at it more, there is actually more aliasing visible than with no AA because of the egregious amount of sharpening used. This is no different than using FXAA and sharpening after the fact.

No one who knows better should be falling for this crap.

It's basically SweetFX oversharpened nonsense (because so many ReShade comparisons I still see today use nonsensical amounts of sharpening) put into a $150 cable with no flexibility to do what you want with it.
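A loose numpy sketch of the edge-detect, low-pass, then sharpen pipeline described above - every threshold, kernel size, and parameter name here is my guess for illustration, not anything Marseille has published:

```python
import numpy as np

def fake_aa(img, edge_thresh=0.1, sharpen=0.5):
    """Crude 'contextual AA' sketch: detect edges, blur only those
    pixels, then sharpen to mask the blur. `img` is a 2D grayscale
    array in [0, 1]. All parameter values are illustrative guesses."""
    # Gradient-magnitude edge detection on the flat 2D image.
    gy, gx = np.gradient(img)
    edges = np.hypot(gx, gy) > edge_thresh

    # 3x3 box blur (low-pass) via padding and neighbour averaging.
    p = np.pad(img, 1, mode="edge")
    blur = sum(p[i:i + img.shape[0], j:j + img.shape[1]]
               for i in range(3) for j in range(3)) / 9.0

    # Blur only the detected edge pixels.
    out = np.where(edges, blur, img)

    # Unsharp mask: add back a fraction of the high-frequency detail.
    return np.clip(out + sharpen * (out - blur), 0.0, 1.0)
```

Note it never has sub-pixel coverage information, so a hard staircase edge stays a staircase - just a softened, re-sharpened one, which matches what the post is complaining about.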
 
Post-processing methods of AA blur all edges on screen, which is bearable for polygons but downright awful for textures. By design, they can never be as good as the proper AA techniques. As several have pointed out already, this is useful for consoles, esp. older ones. On PC, you can force FXAA, which will give you similar, if not better results for barely a couple of frames.
No, it's not really, as tons of textures are heavily aliased in games. The more high-frequency information is present in textures - in combination with whether mip maps are used and what kind of mip-map streaming setup they have - the more aliasing is present, and thus the more texture shimmering exists.

The problem, however, is that while PPAA can decently attenuate this information (and it's often done slightly too aggressively), it cannot do so in a temporally stable manner, as the image changes frame to frame and it has no sub-pixel information to work with.
Hence it doesn't help solve temporal aliasing issues.

High-frequency surface aliasing is often perceived as detail, but it actually isn't; it's just more undersampling in the image, an artifact of how it is rendered. There is a balance, but more often than not it's egregious.


Also, let it be known that PPAA like FXAA and SMAA works very well when used in conjunction with SSAA such as downsampling specifically, since it can be applied to the image pre-resolve, which negates the drawbacks of using it. It only further improves image quality and, most importantly, edge quality. But most people don't seem to be aware of this and have only seen how it works at native res - which is better than nothing, but not great.
http://screenshotcomparison.com/comparison/119174 (As a simple example)
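A minimal sketch of why the order matters, using a plain box blur as a stand-in for a real PPAA pass (the blur kernel and the 2x downsample factor are illustrative choices, not what any shipping FXAA/SMAA implementation does):

```python
import numpy as np

def box_blur3(img):
    """3x3 box blur as a crude stand-in for a PPAA pass (FXAA/SMAA)."""
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def resolve(img, factor=2):
    """Box-downsample (the SSAA 'resolve'): average factor x factor blocks."""
    h, w = img.shape
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# Order matters: filtering at the high res *before* the resolve means the
# downsample averages away most of the blur the filter introduced, while
# blurring after the resolve smears the final native-res pixels directly.
hi = np.random.default_rng(0).random((8, 8))  # stand-in 2x supersampled frame
pre_resolve = resolve(box_blur3(hi))          # PPAA applied pre-resolve
post_resolve = box_blur3(resolve(hi))         # PPAA applied at native res
```

In source-image terms, the post-resolve blur covers a 6x6 high-res footprint versus 3x3 for the pre-resolve pass, which is one way to see why the pre-resolve combination keeps more detail.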


FXAA is at least better than the troubling current trend of TAA techniques that are absolute garbage. These techniques (UE4 TAA, Unity TAA, Vermintide TAA - there are countless examples) super-aggressively filter every surface in the game as soon as there is movement and then try to sharpen them afterwards, turning the image and most surfaces into a splotchy-looking mess while leaving behind a bunch of still-aliased geometric edges, along with many other artifacts such as after-images and clamping + screen-space discontinuity problems that lead to split-second break-ups of parts of the image. These completely go against the purpose of what AA is supposed to do: resolve the information beneath the undersampling.
Example pictures:
https://u.cubeupload.com/MrBonk/vermintide2017090200.png (Notice the after-image artifacts to the right of the sword, the cups, candles, and stools; the splotchiness of the image, especially in what were once high-frequency areas like the sword near the specular highlight; and the poor edge quality and broken edges.)

https://u.cubeupload.com/MrBonk/highressplotchy.jpg (This is from a high-res capture to really showcase how aggressively they filter surfaces, resulting in a loss of actual detail into what looks like a Photoshop paint filter.)
https://u.cubeupload.com/MrBonk/fastmotion.png (Same from Unity. For contrast, here's Unity with zero movement at all: https://u.cubeupload.com/MrBonk/asdfe.png. They all look OK-ish without any movement. But that's the problem.)
https://u.cubeupload.com/MrBonk/wp0008.jpg standing still. https://u.cubeupload.com/MrBonk/wp0009.jpg when you start to move.



UE4's TAA has evolved and gotten more inconsistent and awful over time. The only thing that isn't as bad as it used to be: older builds of the engine had a horribly weird warping-like motion effect with foliage. That's not there any more, but it generally looks worse overall.
I had some great captures of Knack showing these same TAA problems, but they are somewhere else at the moment.

Some TAA is good, like SMAA T2x (which shines with SSAA on top). REVII's TAA overly softens the image but doesn't have as many of the surface filtering issues. (Turn off SS reflections, though - holy jesus, those are bad, and using TAA just makes them look super weird.) With SSAA it really shines.

But most of it is garbage that is trying too hard to get good AA for basically the same cost as FXAA, or barely more, and more and more titles are taking after the likes of the above examples.
 
From what I've seen, this cable is better than no AA, but "real" AA is still substantially better visually.
I can't see it being an option for PC gaming, but it could be useful for watching low-res movies on a high-res screen or simply playing console games.

... No AA clears all jaggies. 1080, even with the best AA, always looks a bit jaggy. ...
Only when the pixels are too large. Take your typical 5" 1080p smartphone and look at it from about one meter. Does it look jaggy? And if you apply 4x FSAA?
(I think FSAA, introduced about the same time as GeForce 2, is used way too little nowadays that we have the processing power to handle it...)
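A quick back-of-envelope check of the smartphone claim, assuming the common ~1 arcminute rule of thumb for normal visual acuity (my assumption, not a figure from the thread):

```python
import math

# Can you see individual pixels on a 5" 1080p phone from 1 m away?
# Rule of thumb (assumed): ~1 arcminute is the limit of normal acuity.
diag_in, w_px, h_px, dist_m = 5.0, 1920, 1080, 1.0

diag_px = math.hypot(w_px, h_px)        # pixels along the diagonal
ppi = diag_px / diag_in                 # ~441 pixels per inch
pitch_m = (diag_in / diag_px) * 0.0254  # size of one pixel in metres

arcmin = math.degrees(math.atan2(pitch_m, dist_m)) * 60
print(f"{ppi:.0f} PPI, one pixel subtends {arcmin:.2f} arcminutes")
```

At roughly 0.2 arcminutes per pixel, individual pixels (and so jaggies) are well below the acuity threshold at that distance, which supports the point about pixel size.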
 
For those who don't like reading.

I'm surprised by the cable, and Linus did a good job.
I'm not sure if I would benefit from using it with my Fire TV for streaming Hulu, Netflix and so on (I'm not clear on the application limits at the moment - settings and stuff, I will read later), but if I can use it, I would benefit from some of the things it does; the streaming quality is not always that awesome, I mean.
 
I think I may pick one up to use with an OG Wii that has homebrew. I currently use a WIIHD adapter, and the picture quality is still trash. I will report back.
 
No, No, and NO!

The point of digital is that faults are immediately detectable and there is no dispute over signal quality. Once everything and its uncle is doing error correction and enhancement, you lose that. So is your picture bad because the cable company is over-compressing, or do you have a bad cable? "He said / she said" was a hurdle you had to get over with the cable company in the analog days.
 
There is no way this cable-based AA can be anything more than a "smart" blur filter - it can't use the depth buffer to help find edges.

It's a neat idea for consoles, though.
 
Has anyone other than Linus tested it? I really am curious. Zelda on the Wii looks like crap; I know I can play it in Dolphin, but it just works on the Wii without me messing with a ton of settings.
 
I give this a "meh" all around. The marketing is pure snake oil (it's NOT AA in any way, shape, or form), but in some cases it can improve a shit image. This reminds me of how we used to "save" a shitty picture at a photo studio: take the image, create a duplicate layer and convert it to black and white, invert the colors, apply a high-pass filter (usually less than 5px), and then change the opacity to fit the image (usually less than 25%). It adds contrast to the edges while gently blurring any detail, hiding any focus problems. It's really only useful if your source is a total trashfire - similar to the cable. Who wants to bet Linus got paid for that glowing review?
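That rescue recipe can be sketched loosely in numpy. The additive blend at the end is a simplification of Photoshop's layer-opacity blending, and a box blur stands in for the Gaussian inside a real high-pass filter - function and parameter names are mine:

```python
import numpy as np

def rescue_image(img, radius=2, opacity=0.2):
    """Loose sketch of the photo-studio rescue recipe: grayscale copy
    -> invert -> high-pass -> blend over the original at low opacity.
    `img` is an HxWx3 float RGB array in [0, 1]."""
    gray = img.mean(axis=2)  # duplicate layer, desaturated
    inv = 1.0 - gray         # invert colors

    # High-pass = image minus its low-pass (box blur of given radius).
    k = 2 * radius + 1
    p = np.pad(inv, radius, mode="edge")
    h, w = inv.shape
    low = sum(p[i:i + h, j:j + w] for i in range(k) for j in range(k)) / k**2
    high = inv - low         # signed edge detail

    # Blend the detail layer over all three channels at low opacity.
    return np.clip(img + opacity * high[..., None], 0.0, 1.0)
```

Because the detail layer is nonzero only near edges, the effect concentrates there while flat regions pass through unchanged - the same "hide the flaws" behavior the post describes.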
 
I actually wanted to do this kind of thing with SweetFX/ReShade a while back: post-processing on a secondary GPU. It's not quite the same thing, but a step in that direction at least. I wish you could do something similar to SoftTH with a secondary GPU - clone the active program's rendering and then apply post-process injection to the cloned secondary display via the secondary GPU. This is cool, but nowhere near as cool as using a secondary GPU exclusively for post-process effect injection would be, if that ever came about. Hell, that's what Lucid probably should have done; it would have been better and more reliable than CF/SLI has ever been, and less restrictive.
 
How exactly would the console know you have such a piece of hardware? EDID? I would think the device should act as a pass-through for EDID so that the television's EDID was detected.
This has nothing to do with the cable. I was referring to the person I replied to, who said to use an SMAA injector on their PC.
 