VGA In Memoriam

Thank goodness. That connector was error-prone, unreliable, and a hassle to use at times.

Next in line is DVI.

We've been moving away from connectors with lots of pins with screws to secure the connection (VGA, DVI, serial, parallel, game port) to moderate force screw-less connectors (USB, HDMI, DisplayPort, Thunderbolt).
 
I still have a VGA port on my laptop (a Dell Latitude E6420), and it gets regular use, unlike the HDMI port, which I have never used. It's still useful in many respects, just like an actual serial port is: while the vast majority may have absolutely no use for such aspects of computer hardware, a lot of us actually do.

So no, I'm not sad to see it go (because it's not going anywhere anytime soon); it will just continue to be used less and less.
 
Unfortunately, people, in their hurry to get bigger, better, and faster, tend to forget that sometimes you still need a component or a piece of tech that just works.

I've been in situations where the more modern and 'better' connectors just wouldn't work or cut it.

VGA to the rescue. Like having a pencil and a paper pad in a modern meeting: it doesn't need batteries, a password, or the right light to work.
 
Yeah those "moderate force screw-less connectors" really hold up well in the real world.

HDMI is one of the most feeble connectors in history. Second only to SATA.
 
Yeah those "moderate force screw-less connectors" really hold up well in the real world.

HDMI is one of the most feeble connectors in history. Second only to SATA.

SATA I won't argue with, but I've never once had an HDMI connector fall out by itself or break in any way. DVI and VGA, on the other hand, are prone to bent pins, stripped screws, stuck screws, screw terminals unscrewing off the back of the host along with the stuck VGA screws, connector housings separating because the screws got stuck and the last person said the hell with it and removed the screw entirely, and so on.

The other arguments, about "it just works in a pinch" and corporate settings with projectors, are both perfectly valid, but there is no advantage to the VGA/DVI form factor itself.
 
I have to agree about HDMI being fragile. Usually it's an issue when there is tension pulling the HDMI cable up or down relative to how it sits in the port. DVI and VGA can both usually handle a fair amount of tension on the cable, but HDMI is barely better than USB.

Of course, a lot of that has to do with the cable. Usually it's either a cheap cable with terrible build quality, where the connector bends or breaks, or a really expensive cable with cabling so thick that it hardly allows any bend at all, putting extra strain on the connector until it bends or breaks.
 
All those bad things you described happening to DVI and VGA connectors had nothing to do with the connector falling out by itself. They are the result of someone tripping on a cable, or using a cable to pull a rolling desk, or trying to get a cable to stretch that last goddamn inch because they don't want to spend an hour repositioning the computer and monitor to make it work for real.

In all of those cases HDMI would just pop out like a breakaway... or, what's really fun, get sheared off inside the port.
 
Why not just use HDMI? It's effectively pin compatible with DVI and the plug for it comes in 3 sizes.

HDMI is pin compatible with single-link DVI. If you're talking about dual-link DVI, you can't convert that to HDMI with just a passive adapter or cable (unless you are okay with it reverting to the equivalent of single-link DVI). Only recently did higher versions of HDMI get to the point of providing bandwidth equivalent to dual-link DVI. I have a 120 Hz monitor that is a few years old, and I can only do 120 Hz over dual-link DVI or DisplayPort. If HDMI were my only option I would be stuck at 60 Hz.
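
To put rough numbers on the 120 Hz point (a minimal back-of-the-envelope sketch; the blanking figures are approximate CVT reduced-blanking values and the limits are the nominal TMDS clocks, none of it taken from the article):

    # Approximate pixel clock needed for a mode, vs. TMDS clock limits.
    def pixel_clock_mhz(w, h, hz, h_blank=160, v_blank=31):  # rough CVT-RB blanking
        return (w + h_blank) * (h + v_blank) * hz / 1e6

    clk = pixel_clock_mhz(1920, 1080, 120)
    print(f"1920x1080 @ 120 Hz needs ~{clk:.0f} MHz")                   # ~277 MHz
    print("fits single-link DVI / HDMI <= 1.2 (165 MHz)?", clk <= 165)  # False
    print("fits dual-link DVI (330 MHz)?", clk <= 330)                  # True

Which is why a 165 MHz single-link path tops out around 60 Hz at 1080p, while dual-link DVI (and HDMI 1.3's 340 MHz TMDS clock, once devices actually supported it) has the headroom for 120 Hz.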
 
Any word on whether Pascal still retains a DAC for DVI-I output? VGA itself is nowhere near dead, but they're trying to hurry it along into its grave for better or worse.
 
I guess the way I feel is that video cards and systems tend to become old and obsolete, but I don't think I've thrown out a monitor or projector in 12 years.

They keep getting repurposed, because they still work and there is nothing technically wrong with the old ones.

Because of this, the need for connectors like DVI and VGA isn't going anywhere.
 
I wish it would die, but my most current motherboard, an ASUS Z10PE-D8 WS, has an on-board VGA port! WTF!

Thank God: the other day when I walked past my neighbour's place, he had put an old Dell VGA screen out on the street. Free ;-)
 
That makes no sense at all. VGA can't handle 4K and has problems with 1080p, so there's no reason at all for it to be on a 4K monitor.

I use a VGA-connected monitor at 2048x1536 and it works just fine.
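
Rough numbers back that up. Here's the same back-of-the-envelope estimate applied to the analog side; the ~400 MHz ceiling is an assumption about late-VGA-era RAMDACs, not something from this thread:

    # Approximate pixel clock a VGA DAC must generate for each mode.
    def pixel_clock_mhz(w, h, hz, h_blank=160, v_blank=31):  # rough CVT-RB blanking
        return (w + h_blank) * (h + v_blank) * hz / 1e6

    for w, h in [(1920, 1080), (2048, 1536), (3840, 2160)]:
        clk = pixel_clock_mhz(w, h, 60)
        ok = "within" if clk <= 400 else "beyond"
        print(f"{w}x{h} @ 60 Hz: ~{clk:.0f} MHz ({ok} a 400 MHz RAMDAC)")

So 2048x1536 at 60 Hz sits comfortably within what the better RAMDACs could drive (signal quality over a cheap cable is another matter), while 4K at 60 Hz is far beyond any of them.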
 
So many users, so little reading of the article.

OK, I read the article just now: there are multiple statements about VGA being hard or impossible to find on a current motherboard, and about there being no new systems with VGA. Both are wrong. Aside from the neat history of VGA, it sounded like a pure opinion piece. I can assure the universe that VGA isn't dead or going away this year.

Or am I not supposed to have noticed those comments? :confused:

I guess this "article" is like those "PCI dies this year" articles that pop up every so often. :D Guess I didn't need to read the actual article, the blurb the OP posted told me what I needed to know.
 
Absolutely. VGA was the PC game (no pun intended) changer. For the first time the clones finally had GFX as good as or better than the Atari/Amiga. And they never looked back.

As far as connecting goes there will always be super cheap adapters.

Uh, I wouldn't go that far. The Amiga had hardware 2D graphics acceleration (e.g. a hardware blitter and sprites), something that PCs wouldn't get until the 90s with the "Windows Accelerator" boards (which were never really supported under DOS, making them moot for most games).
 
Fully agree with damicatz.

The level of hardware acceleration (both for graphics, and for sound) in the Amiga made it a superior platform, IMHO, up until the advent of Direct3D.

The launch of the 3dfx Voodoo 1 was what finally made PCs a superior gaming platform.

In the early-to-mid 90's the Amiga 1200 performed WAY better in games than my 286 and later my 486SX-25 (overclocked to 50 MHz; try getting a 100% overclock today :p ), and even the venerable Amiga 500 still looked good.

Yes, once VGA was launched, the PC was technically able to produce more colors from a larger palette than the Amiga 500 could, and once SVGA came around the PC had that advantage over the Amiga 1200 as well. But there was just something about the hardware acceleration on those machines that made the gameplay experience more fluid and superior at the time.

Growing up in Sweden, where the Amiga was VERY popular, I was, in elementary and middle school in the early 90's, one of the few PC owners I knew. Most of my friends had Amiga 500's and later Amiga 1200's; they had more titles, and the titles that were cross-platform played better on them, IMHO.

The only thing I didn't like about the Amigas back then was that most people didn't get monitors for them, and just plugged them into old tube TVs, which made the graphics less crisp and the colors a bit washed out and pastel-like.
 
That was actually a really fun writeup, thanks for sharing.
RIP VGA! :cool:
 
I automate data center deployments, and the one thing every single server has is a VGA port; I don't think that's going away anytime soon in that environment. It's pretty important when it's time to bust out the crash cart for troubleshooting, or when you lack OOB management for whatever reason.
 
As for servers, that's a problem, since all of ours are using VGA through a switch box. If I needed to buy a server in the next couple of years, I'd likely buy an older model if that were the only way I could get one with a VGA port.

I still have LCD panels in use that are VGA only. I'd hate to have to replace them just because manufacturers stop putting VGA ports on desktops.

The hell? Upgrade your switch box? Buy $2 VGA to DVI adapters?

It's people like you who are unwilling to move on from old tech that's caused the hardware stagnation we've seen since the days of XP, even more so than the lack of demanding software.
 
No VGA means no analog output/DAC, which is exactly what your DVI-I adapter would need in order to output VGA. That's what is going away, leaving DVI-D only (assuming we are talking about future systems which no longer have DVI-I outputs).
 
The hell? Upgrade your switch box? Buy $2 VGA to DVI adapters?

The cheap ones only work on DVI-I (up until now the most common) and DVI-A (which I have never seen).

My Gigabyte AMD FM2 motherboard (equipped with an A10-7850), however, doesn't have a VGA port, and its DVI port is DVI-D, so the adapter simply does not fit.

You'd need an active VGA adapter with a complete DAC in it for that to work. I've never gone shopping for one, but I am sure they exist. Probably much more expensive though.
 
10 bucks breaks the bank?

http://www.amazon.com/Rankie®-Gold-..._UL160_SR160,160_&refRID=1Q6F7A5Z7X4F5TNT4V28
 
I hope DVI sticks around for a very long time. DP/HDMI ports are generally of shitty quality; cables and ports break easily. People confuse the ports for USB and try to force their USB Type-A cables into them, breaking them.

I manage over 2k computers between 2 sites, and having seen the change from VGA+DVI to VGA+DP and the problems it brings, I would rather keep VGA+DVI in the business world, where low/mid-range hardware gets used by _users_...
 
Erm, I saw this on Hackaday's RSS feed and VGA isn't dead or anything, so...

A bit redundant; I suppose the Front Page News forum is becoming an aggregator in the same vein as, say, Buzzfeed or something. :confused:
 
Most of the servers I work on have VGA - it's simple, cheap, and adds very little overhead to the system. I could see this going to DVI, but then look at all the KVMs that would need replacing...

Who uses KVMs?
 
Who uses KVMs?

People who haven't figured out how to remote manage their servers?

I usually only have a screen/keyboard hooked up when I need to install an OS, get into the BIOS, etc., and only then if the server lacks IPMI.

Once set up, I manage everything via SSH.
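
For what it's worth, here's a minimal sketch of that workflow with ipmitool's serial-over-LAN console (the host and credentials are placeholders, not anything from this thread):

    # Sketch of the "IPMI instead of a crash cart" routine: query power and
    # attach the serial-over-LAN console. Assumes the BIOS has serial
    # redirection enabled and ipmitool is installed.
    import subprocess

    BMC_HOST, BMC_USER, BMC_PASS = "bmc.example.com", "admin", "changeme"

    def ipmi(*args):
        # lanplus = IPMI v2.0 over the network, required for SOL
        return subprocess.run(
            ["ipmitool", "-I", "lanplus", "-H", BMC_HOST,
             "-U", BMC_USER, "-P", BMC_PASS, *args],
            check=True,
        )

    ipmi("chassis", "power", "status")  # is the box even powered on?
    ipmi("sol", "activate")             # attach the text console, BIOS and all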
 
Who uses KVMs?

When things go wrong and it's easier to just walk up to the server... you could use IPMI, but you can also give up 1U for an LCD/keyboard/KVM combo that's always there...
 
I'd be happy to see it go, but it is still used a lot in servers. Most KVMs are VGA, though DVI ones are getting more popular, funnily enough, just as DVI is becoming obsolete.
 
I use it all the time, either for servers or for projectors with my laptop. I see people fumbling with the newer HDMI/DisplayPort connectors (not to mention that HDMI comes in full, mini, and micro sizes; depending on the device I've seen all three used, and DP in full and mini as well). They're getting better, but all too often I watch people spend 5 minutes getting their screen working. VGA always seems to be plug and go. Even my newish T450s has VGA, I presume for this reason. That said, it will slowly be phased out at this point.
 
Yeah, to me, the only reason to keep any analogue standard these days is for convenience of setup.

I have old small screens I use for building rigs and for tests and setups, before I hook everything up.

I would never use VGA on a system I actually have to spend any amount of time looking at. The quality of going through a DAC and back to digital again is just too horrendous, and I haven't owned a CRT in over 10 years...
 
lol, most places will just switch to adapters. But they won't get rid of the VGA.

The sad thing is, it would be nice if they would switch to Mini DP, but I know most will just stick with HDMI. Where I work that already happened: after 10 years of sticking with VGA, the new projector installs went in with HDMI and no DP (the projectors themselves have DP; they just didn't bother running the cables).
 
I don't see how it's dead; there are still a few Skylake motherboards out there with VGA...

http://www.asus.com/us/Motherboards/B150M-A/

If this just means for laptops, well, VGA is just too big of a connector for all these slim notebooks.

This was going to be my point. Any motherboards sold in emerging markets pretty much have to have it according to the guys at GIGABYTE and ASUS. If they don't have it, they don't sell in those markets.
 
I guess Gigabyte wasn't planning on selling the GA-F2A85XN-WIFI in emerging markets :p

As you can see below, it only has DVI-D and HDMI ports, which no passive VGA adapters will work with.

[Image: GA-F2A85XN-WIFI rear I/O panel]

[Image: DVI connector types]
It was a major annoyance when I was first building the system to serve as my HTPC, and the only screen I had down there to set it up with was an old free VGA-based 1024x768 LCD.
 
Good history read. I remember reading about monochrome, CGA, and EGA as precursors to VGA, but I hadn't realized the PC graphics situation was quite so much of a mess back in the 80s, especially compared to its competition. The VGA standard was a big step forward.
 
When things go wrong and it's easier to just walk up to the server... you could use IPMI, but you can also give up 1U for an LCD/keyboard/KVM combo that's always there...

That's what iDRACs and iLOs are for.
 