Innodisk Launches 4K M.2 Graphics Card

Agreed with most of this, but I don't think PCIe bandwidth will be a huge concern.

Back in ~2010, for shits and giggles, I built a custom enclosure, used an ExpressCard-to-PCIe-slot adapter on my work laptop, and briefly ran a Radeon 6850 as an external GPU on it.

Sure, it lost a little performance due to being limited to a single PCIe lane, but not a ton: less than 5%.

Now, one could argue that a Radeon 6850 is a much older card and not representative of modern GPU bandwidth needs, but that was PCIe x1 (not sure if gen 1 or gen 2), and modern M.2 slots run up to PCIe gen 3 x4, so a modern M.2 slot has roughly 8-16 times the bandwidth.
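Rough math on that, using the commonly quoted per-lane PCIe rates (a back-of-the-envelope sketch; the figures below are the usual approximate usable numbers, not measurements):

```python
# Approximate usable per-lane PCIe bandwidth in MB/s (encoding overhead
# already baked into these commonly quoted figures).
PER_LANE_MB_S = {
    "gen1": 250,   # 2.5 GT/s, 8b/10b encoding
    "gen2": 500,   # 5.0 GT/s, 8b/10b encoding
    "gen3": 985,   # 8.0 GT/s, 128b/130b encoding
}

def link_bandwidth_mb_s(gen: str, lanes: int) -> int:
    """Approximate one-direction link bandwidth in MB/s."""
    return PER_LANE_MB_S[gen] * lanes

m2 = link_bandwidth_mb_s("gen3", 4)  # typical M.2 slot today
for gen in ("gen1", "gen2"):
    old = link_bandwidth_mb_s(gen, 1)  # the old ExpressCard-style x1 link
    print(f"gen3 x4 ({m2} MB/s) = {m2 / old:.1f}x {gen} x1 ({old} MB/s)")
```

That prints ~15.8x for gen 1 and ~7.9x for gen 2, hence the 8-16x range above.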

I have a feeling that in single-GPU configurations with VRAM on board, PCIe bandwidth is really only relevant to level load times, when textures are transferred and decompressed.

If you have insufficient VRAM, textures may be streamed live during gameplay, but most of the time this is a level-load task, I believe. (Though don't quote me; I am not a subject matter expert here.)
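For a sense of scale on live streaming (another back-of-the-envelope, reusing the approximate gen 3 x4 figure from above):

```python
# How much texture data a PCIe gen3 x4 link can move per rendered frame.
link_mb_s = 985 * 4          # ~3940 MB/s, approximate gen3 x4 figure
fps = 60
budget_mb = link_mb_s / fps  # per-frame upload budget
print(f"~{budget_mb:.0f} MB of uploads per frame at {fps} fps")
# => ~66 MB/frame: plenty for incremental texture streaming, which is
#    why bulk transfers tend to be a level-load concern rather than a
#    per-frame one.
```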

Yeah, I guess it's hard to disagree. Given the small footprint of an M.2 card, physical size is likely the biggest limiting factor, not bandwidth, though I still think x4 is too narrow for a modern GPU.

I've yet to see an M.2 slot with more than four lanes. That's not to say it isn't possible, just that nobody seems to design that way, and I doubt much will change, since the potential for a video card in that slot is limited at best. Maybe a custom card could be connected elsewhere in the case via some kind of header cable.

This GPU either has no VRAM or 245MB of GDDR3, so it is heavily dependent on its link speed. It's still an interesting concept, but I just don't see it being feasible, not even for digital signage, where an Intel iGPU is most likely more than sufficient without the extra cost and cooling requirements.
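To put the no-VRAM case in numbers (a rough sketch, assuming the card has to scan its framebuffer out of host memory across the link):

```python
# Bandwidth needed just to display a 4K60 framebuffer held in host RAM.
width, height, bytes_per_px, hz = 3840, 2160, 4, 60
scanout_mb_s = width * height * bytes_per_px * hz / 1e6
link_mb_s = 985 * 4  # approximate PCIe gen3 x4 bandwidth
print(f"4K60 scanout: ~{scanout_mb_s:.0f} MB/s, "
      f"{scanout_mb_s / link_mb_s:.0%} of a gen3 x4 link")
# => ~1991 MB/s, about half the link consumed before any rendering
#    or texture traffic at all.
```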
 
Media center PCs are essentially dead. They've been replaced with Android boxes (Nvidia Shield) and Apple TV.



You'd be extremely lucky to get something as well developed and supported as Intel's HD/UHD graphics in there.

Well, speak for yourself regarding media PCs. It has always been a relatively niche market, but the overall suckage of Android for that purpose hasn't changed for anyone wanting more advanced features/software. Android boxes and Apple TV are for the typical user who really only consumes streaming content.
 
Media center PCs are essentially dead. They've been replaced with Android boxes (Nvidia Shield) and Apple TV.



You'd be extremely lucky to get something as well developed and supported as Intel's HD/UHD graphics in there.


While I agree, I think one application could be to add newer hardware video decode capability to an existing system.

If you are relying on an Intel IGP at this point, you have to replace the CPU to upgrade to newer video decode capability, and since Intel changes sockets every goddamned generation, replacing the CPU means replacing the motherboard. There is a real chance you'll have to replace the RAM too.

Much easier/cheaper to slip in a super low end M.2 GPU.
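If you want to check what an existing iGPU can already hardware-decode before bothering with an add-in card, something like this works on Linux (a rough sketch; assumes the VA-API `vainfo` utility from libva-utils is installed):

```python
# List which codecs the current GPU can hardware-decode via VA-API.
# Assumes Linux with the `vainfo` utility (libva-utils) installed.
import subprocess

res = subprocess.run(["vainfo"], capture_output=True, text=True)
report = res.stdout + res.stderr  # vainfo splits output across both
for codec in ("H264", "HEVC", "VP9", "AV1"):
    # Decode support shows up as a VAProfile<codec> line with the
    # VAEntrypointVLD (decode) entrypoint.
    decode = any(codec in line and "VLD" in line
                 for line in report.splitlines())
    print(f"{codec} decode: {'yes' if decode else 'no'}")
```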

But as you say, Media Center PCs are mostly dead, unfortunately. I mean, I still use them, but I know I'm in a small minority.
 
While I agree, I think one application could be to add newer hardware video decode capability to an existing system.

If you are relying on an Intel IGP at this point, you have to replace the CPU to upgrade to newer video decode capability, and since Intel changes sockets every goddamned generation, replacing the CPU means replacing the motherboard. There is a real chance you'll have to replace the RAM too.

Much easier/cheaper to slip in a super low end M.2 GPU.

But as you say, Media Center PCs are mostly dead, unfortunately. I mean, I still use them, but I know I'm in a small minority.

But again, thanks to "paid exclusives" and the much more stringent certification process for playing back HDCP 2.2 content on an open platform like the PC, you're often going to be stuck with a new Intel chip or nothing.

Netflix 4K streaming was an Intel paid exclusive for six months.

4K Blu-ray playback is STILL an Intel paid exclusive, 12 months on.

Even after it's no longer a paid exclusive, it's expensive to get your devices certified for each playback source. This is why Nvidia released their Netflix 4K drivers ONE YEAR before AMD: they spent a lot more to push the cert process forward.

https://www.pcworld.com/article/319...-on-geforce-gtx-10-series-graphics-cards.html

https://www.trustedreviews.com/news/netflix-4-streaming-amd-gpu-drivers-download-3466303

Even though the RX 480 and GTX 1080 were released within a month of each other, with the exact same HDCP and HDMI specifications, it took one platform twice as long to get certified. That screams money required.

I wouldn't expect this tiny company to pay very much for the cert process, so it's dead in the water. If it can't play back legal content, there isn't much of a market for it.
 
This is likely 100% digital-signage related, so that you can run more than two or three displays off a single Intel SoC solution or similar.

It will probably have next to zero actual acceleration and will likely lean on the iGPU for nearly everything, much like a USB 2.0 or USB 3.0 display adapter.

Just with better quality than the USB ones.
 