Innodisk Launches 4K M.2 Graphics Card

AlphaAtlas

According to the Japanese site PCWatch, Innodisk launched the "industry's first industrial M.2 connected video card" yesterday. While the product doesn't seem to be listed on their website yet, Innodisk currently lists two older mPCIe GPUs, and unveiled the 4K M.2 card in a press release back in October. The site managed to get some shots of the diminutive GPU, and it appears to use a "top end" Silicon Motion SM768 chip. The SM768 product page says it supports HDMI, DVI, VGA, DisplayPort, and LVDS interfaces, and has a "128-bit high performance graphics engine. Hardware acceleration of Bitblt, Stretch Blt, Line Draw, Polygon Fill, full ROP3". Software support for Windows 7, 8, and 10, OS X, Android, and various Linux distros is also mentioned.

Display resolution goes up to 4K UHD (3840x2160@30p) or 2K/Full HD@60p. The SM768 supports H.264/MJPEG hardware video decoding, and there's also an option to embed 256MB of DDR3 memory into a single 19x19mm package.
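For perspective, those modes work out to roughly the same raw pixel throughput; a quick back-of-the-envelope sketch (ignoring blanking intervals):

```python
# Raw pixel throughput for the SM768's advertised display modes.
def pixel_rate(width, height, refresh_hz):
    """Pixels per second for a mode, ignoring blanking intervals."""
    return width * height * refresh_hz

uhd_30 = pixel_rate(3840, 2160, 30)  # 4K UHD @ 30p
qhd_60 = pixel_rate(2560, 1440, 60)  # 2K @ 60p
fhd_60 = pixel_rate(1920, 1080, 60)  # Full HD @ 60p

print(f"4K@30:  {uhd_30 / 1e6:.0f} Mpx/s")  # 4K@30:  249 Mpx/s
print(f"2K@60:  {qhd_60 / 1e6:.0f} Mpx/s")  # 2K@60:  221 Mpx/s
print(f"FHD@60: {fhd_60 / 1e6:.0f} Mpx/s")  # FHD@60: 124 Mpx/s
```

So the 4K@30 ceiling is consistent with a scan-out engine sized for about 250 Mpx/s.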
 
I guess this would be good for systems with no iGPU..?

I had this argument with one of my employees this morning... It's a first-gen, first-release product. It, itself, isn't too much to get worked up about. BUT imagine the possibilities... I'm hoping this becomes the new standard replacement for MXM cards. Think of how much smaller desktops/laptops could get. Imagine a NUC with 1060/580 graphics capabilities, or an STX system with that same power. I'd buy and sell the hell out of these. You know, if it gets adopted, becomes mainstream, and supported by AMD/Nvidia.
 
I had this argument with one of my employees this morning... It's a first-gen, first-release product. It, itself, isn't too much to get worked up about. BUT imagine the possibilities... I'm hoping this becomes the new standard replacement for MXM cards. Think of how much smaller desktops/laptops could get. Imagine a NUC with 1060/580 graphics capabilities, or an STX system with that same power. I'd buy and sell the hell out of these. You know, if it gets adopted, becomes mainstream, and supported by AMD/Nvidia.

I share your opinion; however, I hope a third runner competes with Nvidia and AMD. Intel's move into discrete graphics could, I guess, be considered a third, but I'm really tired of them and their lack of progress from Sandy Bridge to now. But yes, the SFF possibilities, as well as lower power usage and heat, will be amazing if this matures.
 
If there was a market, one would surmise AMD would be very competitive with their teensy-tiny scalable cores.
 
I have an ultra-SFF PC with no expansion slots as an HTPC media player, but it's limited to whatever DisplayPort/HDMI features its Intel iGPU supported ~5 years ago; for example, no Dolby or DTS HD audio formats can bitstream through it, just the legacy Dolby and DTS codecs. I'd hate to spend money replacing that PC since it's adequate for HDMI multi-channel PCM and my 1080p TV, but if I could drop something like this into an M.2 slot and then Dremel some holes to mount an HDMI port, I'd be set, since I could likely get a newer HDMI spec from it.
 
Had no idea this was even possible! Still trying to figure out what markets would find this useful, though. Certainly not gaming... even on the low end.
 
Gotta remember that M.2 is small, and that GPUs suck power and produce heat; we're talking maybe low-end mobile GPU solutions. The more important parts will be driver support and output support, but I don't see much utility here as an add-in card.

More likely to see systems built with this style of GPU in mind, for the purpose of adding video outputs beyond what can be supported by onboard solutions (usually three).
 
Gotta remember that M.2 is small, and that GPUs suck power and produce heat; we're talking maybe low-end mobile GPU solutions. The more important parts will be driver support and output support, but I don't see much utility here as an add-in card.

More likely to see systems built with this style of GPU in mind, for the purpose of adding video outputs beyond what can be supported by onboard solutions (usually three).

Didn't mean for my hopes and dreams to come off as "I want it tomorrow." I understand improvements will be generational.

Also, there's no reason a 6-pin power cable couldn't attach to the top of the M.2 card.
 
This is pretty cool. I could think of uses for this!

How do you connect it to a monitor, though? It seems there would have to be a wire going to a connector somewhere. Custom-mod a hole for it? Because I'd imagine that if you already had a free PCIe slot, you wouldn't be using this...
 
Also, there's no reason a 6-pin power cable couldn't attach to the top of the M.2 card.

The power could be supplied, sure, but where does the heat go?

As I said, M.2 is just plain small. I don't think that you'd want to put anything into an M.2 slot that uses external power. By the time you had a cooling solution, you'd have been better off with an AIC.

[Where this might work is not as a form factor, but as an adapted connector: you'd use it like an internal Thunderbolt header for a card placed somewhere else...]
 
I think if you needed an external power connector, your power circuitry would be too large for the form factor; also, the heat would be too much.
 
I think if you needed an external power connector, your power circuitry would be too large for the form factor; also, the heat would be too much.


I think it's doable, but likely very expensive, and it would be difficult to dissipate the heat.

For the record, I have a 12V DC-to-DC PicoPSU that uses an external power brick in 4 of my builds (3 HTPCs and a router) that is this small:

IMG_20170331_213702.jpg

In the grand scheme of things, though, I agree. This is going to be for lightweight applications where you just need a monitor output for simple desktop use, not 3D/gaming applications.
 
This is going to be for lightweight applications where you just need a monitor output for simple desktop use, not 3D/gaming applications.

Biggest issue will be keeping up with the codecs and HDCP support, all of which will require hardware and driver development. Expect 4K HEVC decode, assuming the format takes off at all, to take some time.
 
I think it's doable, but likely very expensive, and it would be difficult to dissipate the heat.

For the record, I have a 12V DC-to-DC PicoPSU that uses an external power brick in 4 of my builds (3 HTPCs and a router) that is this small:

View attachment 125484

In the grand scheme of things, though, I agree. This is going to be for lightweight applications where you just need a monitor output for simple desktop use, not 3D/gaming applications.

Holy shit. That's really cool!
 
Translated from the article: "It's suitable for industrial applications that require a thin and light form factor or high expandability, such as digital signage, medical image-processing terminals, and casino gaming machines."
Yeah, not a product for us normal folks.
 
What a cool idea. Want more GPU? Get some GPU. It's external GPU done right. Good job.
 
Holy shit. That's really cool!

They sell them over at mini-box.com, both as individual units and as part of kits that include the power brick. They don't provide enough power for an enthusiast or gaming system, but for lightweight stuff, they are tiny and very, very efficient.

My i3-7100-based pfSense router built with one of these pulls 6W from the wall at idle.
 
This is pretty cool. I could think of uses for this!

How do you connect it to a monitor, though? It seems there would have to be a wire going to a connector somewhere. Custom-mod a hole for it? Because I'd imagine that if you already had a free PCIe slot, you wouldn't be using this...

The same way an APU does.
 
Hmm, didn't think that was possible, but it's just PCIe, so I guess that works...

Where do I stick my DVI connector, though?
 
I have an ultra-SFF PC with no expansion slots as an HTPC media player, but it's limited to whatever DisplayPort/HDMI features its Intel iGPU supported ~5 years ago; for example, no Dolby or DTS HD audio formats can bitstream through it, just the legacy Dolby and DTS codecs. I'd hate to spend money replacing that PC since it's adequate for HDMI multi-channel PCM and my 1080p TV, but if I could drop something like this into an M.2 slot and then Dremel some holes to mount an HDMI port, I'd be set, since I could likely get a newer HDMI spec from it.

This is pretty cool. I could think of uses for this!

How do you connect it to a monitor, though? It seems there would have to be a wire going to a connector somewhere. Custom-mod a hole for it? Because I'd imagine that if you already had a free PCIe slot, you wouldn't be using this...

You could pull the plastic from over the I/O ports, unscrew and pull or desolder the motherboard's video connectors, and run a cable from the card to where those were on the I/O shield. Cases built with this in mind would probably have a spot similar to where you'd mount an internal USB header, so you wouldn't have to mod your board. I imagine a kiosk or other POS would be the primary use case for something like this.

Edit: alternatively, a (software?) mux could send the signal to the monitor over USB(-C).
 
I think it's doable, but likely very expensive, and it would be difficult to dissipate the heat.

For the record, I have a 12V DC-to-DC PicoPSU that uses an external power brick in 4 of my builds (3 HTPCs and a router) that is this small:

View attachment 125484

In the grand scheme of things, though, I agree. This is going to be for lightweight applications where you just need a monitor output for simple desktop use, not 3D/gaming applications.

The PicoPSU is something like 92% efficient... Processors and GPUs dump 100% of their power input as heat.

If you're pulling 150W, the max for yours I believe, that's only 12W of heat. A 6-pin supplying 75W, at 75W of heat, is entirely different...

And I use a wide-voltage Pico (12-18V or whatever it is) for my car mini-ITX system :) Wonderful product, and one of their Pico UPS units as well, going to an external 12V SLA battery.
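The waste-heat arithmetic is easy to sketch (the 92% efficiency and 150W figures are from the post above; real converter efficiency varies with load):

```python
def psu_waste_heat(load_w, efficiency):
    """Heat dissipated by the converter itself at a given load, in watts."""
    return load_w * (1 - efficiency)

# A 92%-efficient PicoPSU at its 150W max dissipates only ~12W itself...
print(round(psu_waste_heat(150, 0.92), 1))  # 12.0
# ...while a GPU drawing 75W through a 6-pin turns essentially all 75W into heat.
print(psu_waste_heat(75, 0.0))  # 75.0
```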
 
Gotta remember that M.2 is small, and that GPUs suck power and produce heat; we're talking maybe low-end mobile GPU solutions. The more important parts will be driver support and output support, but I don't see much utility here as an add-in card.

More likely to see systems built with this style of GPU in mind, for the purpose of adding video outputs beyond what can be supported by onboard solutions (usually three).

Right, the max power supplied by an M.2 slot is 7W. That's barely enough for integrated-graphics performance levels (from a 15W CPU).

By the time you get enough of a performance increase from these things, your laptop will be in need of a complete upgrade. Why waste dollars on a pointless video card when it has the exact same performance as a new laptop with onboard graphics?

And even if this were high-performance, it would be a corner-case device, since there is no guaranteed way to get a graphics signal from your laptop's M.2 slot up to its screen (you need a path through the hinge).

These cards will be priced at a premium, just like MXM, and just as clunky to install in a specific laptop.
 
I think you might see a market here for retro gaming and mostly hobbyists. It might be just enough for those uses. And a welcome alternative to the god-awful onboard Intel 500 graphics.
 
I think you might see a market here for retro gaming and mostly hobbyists. It might be just enough for those uses. And a welcome alternative to the god-awful onboard Intel 500 graphics.


Where exactly are you seeing a mainstream Apollo Lake system with an M.2 slot? Those are typically reserved for more expensive motherboards ($100 and up, SOCKETED CPU) and notebooks ($600 and up).

Even if you find one, Apollo Lake itself only has 6 PCIe 2.0 lanes. At least half of those will be reserved for other components on the motherboard, so you get at most two lanes. That's 1.0 GB/s of bandwidth (we're back in the AGP days)!
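Plugging in the commonly cited per-lane throughput figures (after encoding overhead; approximations, not spec-exact numbers) confirms the math:

```python
# Commonly cited usable throughput per PCIe lane after encoding overhead, in MB/s.
LANE_MB_S = {"gen1": 250, "gen2": 500, "gen3": 985}

def link_gb_s(gen, lanes):
    """Total link bandwidth in GB/s for a given generation and lane count."""
    return LANE_MB_S[gen] * lanes / 1000

# Two PCIe 2.0 lanes, the most an Apollo Lake M.2 slot could plausibly offer:
print(link_gb_s("gen2", 2))  # 1.0 GB/s
```

AGP 4x topped out around 1.07 GB/s, so two Gen 2 lanes really are in that neighborhood.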

Not to mention they're all E-keyed (for Wi-Fi modules). Those are half the size of the M-keyed x4 slots that SSDs use, which is also what the video card uses.

https://arstechnica.com/gadgets/201...e-interface-that-will-speed-up-your-next-ssd/

IMG_0858-640x425.jpg


You can plainly see from the press release photo that these are M-keyed.

1. These things will only be compatible with more expensive motherboards and more expensive notebooks.

2. These things will only see full performance on premium systems with full x4 PCIe 3.0 M-keyed M.2 slots, and those will always have faster onboard graphics.

This is only for corner cases. Thunderbolt 3 is a VASTLY more plausible upgrade path.
 
For media center PCs and laptops, I could see this tech taking off. The M.2 slot would have to get faster, but if this becomes a thing, I'm sure that shouldn't be a problem.
 
For media center PCs and laptops, I could see this tech taking off. The M.2 slot would have to get faster, but if this becomes a thing, I'm sure that shouldn't be a problem.


No, the M.2 slot is plenty fast enough. It's limited by the 7W power budget.

You can't beat physics. You need a minimum size of graphics card if you want something more powerful than onboard, and you have to keep a MINIMUM GAMING graphics card COOL, because they use at least 30W.

Laptops cheat on this by using heatpipes to cool the GPU, but the MXM cards they're attached to are still much larger than this. At best, you could double the power consumed without overheating (versus air cooling). 15W is less power than a low-power MX150 consumes, and you all know how shitty that is.
 
I was rockin' dual Voodoo 2s back in the day. Also had the 3D glasses; they were horrible.
I've still got my glasses (and an older Orchid 3D Voodoo 1)!
It was crazy seeing Doom 2 in 3D stereo for the first time.
They weren't much use for anything else, though.
 
According to the Japanese site PCWatch, Innodisk launched the "industry's first industrial M.2 connected video card" yesterday. While the product doesn't seem to be listed on their website yet, Innodisk currently lists two older mPCIe GPUs, and unveiled the 4K M.2 card in a press release back in October. The site managed to get some shots of the diminutive GPU, and it appears to use a "top end" Silicon Motion SM768 chip. The SM768 product page says it supports HDMI, DVI, VGA, DisplayPort, and LVDS interfaces, and has a "128-bit high performance graphics engine. Hardware acceleration of Bitblt, Stretch Blt, Line Draw, Polygon Fill, full ROP3". Software support for Windows 7, 8, and 10, OS X, Android, and various Linux distros is also mentioned.

Display resolution goes up to 4K UHD (3840x2160@30p) or 2K/Full HD@60p. The SM768 supports H.264/MJPEG hardware video decoding, and there's also an option to embed 256MB of DDR3 memory into a single 19x19mm package.

Given the small area, the highly limited bandwidth compared to a normal PCIe slot, and nearly nonexistent cooling, I'm not sure what practical use this would have. I doubt it will be much faster than an Intel iGP, if at all, and it will likely require a special computer setup to sink the heat. Maybe it will be useful in some custom system running off an Atom with an M.2 slot to boost graphics; either way, an odd device. I suppose it's better to have more options than not...
 
Given the small area, the highly limited bandwidth compared to a normal PCIe slot, and nearly nonexistent cooling, I'm not sure what practical use this would have. I doubt it will be much faster than an Intel iGP, if at all, and it will likely require a special computer setup to sink the heat. Maybe it will be useful in some custom system running off an Atom with an M.2 slot to boost graphics; either way, an odd device. I suppose it's better to have more options than not...


Agreed with most of this, but I don't think the PCIe bandwidth will be of huge concern.

Back in ~2010, for shits and giggles, I built a custom enclosure and used an ExpressCard-to-PCIe-slot adapter on my work laptop, and briefly ran a Radeon 6850 as an external GPU on it.

Sure, it lost a little bit of performance due to only being PCIe x1, but not a ton. Less than 5%.

Now, one could argue that a Radeon 6850 is a much older card and not representative of modern GPU bandwidth needs, but that was PCIe x1 (not sure if Gen 1 or Gen 2), and modern M.2 ports go up to x4 PCIe Gen 3, so a modern M.2 slot can have roughly 8 to 16 times more bandwidth.
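A sanity check on that ratio, using rough per-lane figures (~250/500/985 MB/s for Gen 1/2/3 after encoding overhead; approximations, not spec-exact numbers):

```python
# Approximate usable throughput per PCIe lane (MB/s), post-encoding.
LANE_MB_S = {"gen1": 250, "gen2": 500, "gen3": 985}

def speedup(gen_new, lanes_new, gen_old, lanes_old):
    """How many times more bandwidth the new link has over the old one."""
    return (LANE_MB_S[gen_new] * lanes_new) / (LANE_MB_S[gen_old] * lanes_old)

print(speedup("gen3", 4, "gen1", 1))  # ~15.8x a PCIe 1.0 x1 ExpressCard link
print(speedup("gen3", 4, "gen2", 1))  # ~7.9x if that slot was actually Gen 2
```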

I have a feeling that in single-GPU configurations with VRAM on board, PCIe bandwidth is really only relevant to level load times, when textures are transferred and decompressed.

If you have insufficient VRAM, textures may be streamed during gameplay, but most of the time this is a level-load task, I believe. (Though don't quote me, as I am not a subject matter expert here.)
 