Matrox D1450 Graphics Card Now Shipping

erek ([H]F Junkie, joined Dec 19, 2005)
"D-Series includes the robust and field-proven Matrox PowerDesk desktop management software. Users can select from a variety of advanced tools—including stretched or independent desktops, clone mode, pivot, bezel management, edge overlap, and more—to easily configure and customize multi-display setups. The feature-rich Matrox MuraControl video wall management software meanwhile provides users with an intuitive platform to manage video wall sources and layouts either locally or remotely, and in real time. Matrox video wall APIs, SDKs, and libraries are also available for developers and AV installers interested in creating custom control functions and applications.

"Video walls don't have to be difficult, and the Matrox D1450 graphics card is a perfect example of how we are constantly aiming to make designs and deployments easier for the customer," said Fadhl Al-Bayaty, business development manager, Matrox. "Having a single-slot, multi-4K-capable card with full-size HDMI connectors provides OEMs and system integrators with a new found flexibility to reach new levels of scalability and convenience from a single video wall processor. We're excited to see our video wall customers take advantage of D1450 in their upcoming installations."
Availability

The Matrox D-Series D1450 quad-monitor HDMI graphics card (part number: D1450-E4GB) is now shipping worldwide."


https://www.techpowerup.com/270507/...-high-density-output-video-walls-now-shipping
 
Matrox made a stink in the 2000s with their Parhelia, which was supposed to crush the GeForce and early Radeons, especially at DX9. It didn't, did the opposite in fact, and Matrox has retreated from the GPU market ever since. They make money now with display technology, more B2B than consumer.
 
matrox who? no, seriously

We used their Video Editing cards for 15 years. Had the RT2000, RT2500, RT.X100, and RT.X2 HD.
(attached image: IMG_2408.JPG)


And my RT2000 setup waaay back in the day,
(attached image: edit1.jpg)
 
Unrivaled 2D quality, though.
 

The Parhelia was definitely hyped, and before release it looked like there was every reason for it. With four pixel pipelines carrying four TMUs each, it was set to be a brute. The rest of the DirectX 8-compliant hardware also looked respectable: pixel shader flexibility was better than Nvidia's though lower than ATi's, vertex shader capability was allegedly DirectX 9-level, it had some very nice antialiasing options for both 3D and text, and the hardware was capable of anisotropic filtering (though the driver restricted it to 2x...). Unfortunately, it shipped with essentially no bandwidth-saving logic in its memory controller. Aside from fast Z-clear and texture compression, nothing on the back end culled hidden pixels, so its memory bandwidth was squandered on overdraw and couldn't keep that 4x4 pipeline config reliably fed. The chip itself was also buggy, with problems in the vertex shader that kept it from reaching DX9 compliance and precision issues that kept the OpenGL driver from ever being competitive. Heck, if I remember correctly, there was a 10% variance in the clock speed of shipping cards depending on how each particular chip fared in unit testing, which would never happen now.
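The bandwidth point above is easy to sanity-check with a rough back-of-envelope sketch in Python. The figures used here (220 MHz core, four pipelines, 256-bit DDR at 275 MHz) are approximate launch-era Parhelia-512 numbers from memory, not official specs:

```python
# Back-of-envelope: why overdraw starves a GPU with no
# bandwidth-saving logic on the back end.
# All hardware figures below are assumed approximations.

core_mhz = 220          # approximate Parhelia core clock
pipes = 4               # pixel pipelines
bus_bits = 256          # memory bus width
mem_mhz = 275           # DDR memory clock (550 MT/s effective)

peak_fill = core_mhz * 1e6 * pipes            # pixels per second
peak_bw = bus_bits / 8 * mem_mhz * 1e6 * 2    # bytes per second (DDR)

# Each rendered pixel touches roughly 12 bytes of framebuffer
# traffic: 4 B color write, 4 B Z read, 4 B Z write (textures ignored).
bytes_per_pixel = 12

for overdraw in (1, 2, 3):
    needed = peak_fill * bytes_per_pixel * overdraw
    print(f"overdraw {overdraw}x: need {needed / 1e9:.1f} GB/s "
          f"of {peak_bw / 1e9:.1f} GB/s available")
```

At a typical scene overdraw of 2-3x, the raw fill rate already demands more bandwidth than the 17.6 GB/s bus can deliver, which matches the "squandered on overdraw" observation above.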

They kept using that Parhelia core for years, though after a point they leaned hard on the cut-down P650/P750, basically the expensive-to-manufacture Parhelia halved to two pipelines with two TMUs each, because Matrox ceded any chance at prioritizing 3D speed after the Parhelia's failure. That core was eventually fixed up enough to attain Direct3D 9 compliance, which let the Matrox M series ship drivers capable of supporting WDDM; from testing an M9120 I'll say the Direct3D 9 support works, but it's slow, and the OpenGL is still deeply questionable. They now partner with AMD, and lately Nvidia, because there's no way their hardware design team could make a GPU competitive by today's standards. But they're still alive!
 
Yeah, consumers are tough to please, and the competition stiff. Business is easier because you can take the time to do everything right and charge for it.
 
I think I still have a Matrox G200 in a box somewhere. Nice to see they still exist.
 
I love seeing their name in a headline again. I remember having a G200.
 
True that the Parhelia was overhyped, overpriced, and too late to market. However, what preceded it wasn't bad at all. The Millennium was a damn good 2D card in its day. Pretty much everyone else in the mid-to-late 90s had trash 2D and shit RAMDACs. Matrox was your go-to for driving a quality analog signal (with proper cables) to your CRT (assuming it was worthy). There used to be a market for high-end analog outputs (VGA-to-BNC inputs, etc.), although most of my friends couldn't afford a decent CRT, so they couldn't have cared less about quality. But I sure did. Now with digital, it's all taken for granted.

The G400 series was also pretty good. I still have my Millennium II and G400 Max + Marvel G400-TV. I have a lot of fond memories of those cards. Gaming on a G400 was just as good as the TNT2, although Matrox screwed the pooch at launch on OpenGL support. However, they fixed that months later with their MiniGL.

I went from SLI Voodoo 2s (absolute trash analog outs, especially with the passthrough) to a G400 MAX, and eventually retired that once the GeForce 1 showed up. Even the first- and second-gen GeForces had questionable RAMDACs for driving my then-aging CRT. Once LCDs, and thus DVI, showed up, none of that mattered anymore.

I still have a soft spot for Matrox, but today it's pure nostalgia. They never had it in them to compete in the gaming segment. Now you'll only find them in esoteric places such as hospitals, airports, large corporate buildings, etc. Yeah, they are still around, but just barely.
 
I remember owning a G200 back in its day, and was actually quite pleased with the picture quality, along with its performance on my K6-2 300 CPU. I remember being able to play Heretic II, with its specific optimization for the G200, and being satisfied with the frame rate, along with enjoying the picture quality on my 17" Trinitron CRT.

Unfortunately, everything they put out after that didn't impress me that much, since there were better options from ATi and NVIDIA for less money, especially when it came to the Parhelia / Parfailia.

I simply don't see Matrox ever becoming a mainstream GPU maker again, since they can't really offer anything cheap enough to compete with integrated video, and the gaming market is already saturated by AMD and NVIDIA. They're going to be limited to a very specific niche market, and a minuscule one at best.
 
Matrox also had the TripleHead2Go in 2006.

I never had one but I do recall people having pretty sweet racing setups and the like years before Eyefinity and Surround became a thing.
 