An occasional outlet for my thoughts on life, technology, motorcycles, backpacking, kayaking, skydiving...

Monday, March 31, 2008

Why are they putting VGA ports on modern computers?!?

I should start by saying, "I'm going crazy over this now." I'll just get that out of the way.

PC manufacturers continue to make computers (desktops, notebooks, ultra-portables, and motherboards from ATX through SBC) with VGA ports on them. This is totally ridiculous. VGA is an analog output. DVI output is [basically] always actually DVI-I, which carries both analog and digital signals. DVI-I can be converted to VGA with an inexpensive passive (no electrical components) adapter. In short, DVI can drive an analog display, but VGA cannot drive a digital display.

Why does this matter? Because modern televisions are being made with high resolutions (1920x1080 or 1366x768/1368x768) and no VGA ports. This is perfectly reasonable, because those high resolutions require tremendous bandwidth, and analog cables are very prone to interference at those rates. Another consideration is the cost of high-resolution analog-to-digital converters. It makes no sense to have your video source convert its digital data to an analog signal, transfer it over a highly vulnerable cable, and then convert it back to digital at the destination display. This brings me to my second point.
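To put a rough number on "tremendous bandwidth," here is a back-of-the-envelope pixel-clock calculation. The 60 Hz refresh rate and the ~1.3 blanking-overhead factor are illustrative assumptions of mine, not figures from this post:

```python
# Approximate pixel clock needed to drive a display mode over an analog link.
# blanking_overhead accounts for horizontal/vertical blanking intervals
# (the ~1.3 factor is a rough typical value, assumed for illustration).

def pixel_clock_mhz(width, height, refresh_hz=60, blanking_overhead=1.3):
    """Approximate pixel clock in MHz for a given display mode."""
    return width * height * refresh_hz * blanking_overhead / 1e6

for mode in [(1366, 768), (1920, 1080)]:
    print(mode, round(pixel_clock_mhz(*mode), 1), "MHz")
```

At roughly 160 MHz for 1920x1080, every pixel's voltage level must survive the cable intact; any induced noise shows up directly on screen, which is why analog runs get dicey at these rates.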

All video cards/chipsets start with digital information. They process the information into a digital video signal. The signal is then passed to a D->A converter. Here is where things diverge. The device either:

  1. sends both the digital and analog signals out a DVI port.

  2. trashes the digital signal and sends the analog signal out a VGA port.

Clearly option 2 is a total waste, and it leaves the device worthless for connecting to a television for use as a PVR or web-surfing device. I am amazed that in 2008 it is still hard to find motherboards with DVI output, and it's virtually impossible to search sites like NewEgg and TigerDirect for them.
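The fork described above can be sketched as a toy function. This is not a real driver API; the names (`dac`, `video_output`) are purely illustrative:

```python
# Toy model of the two output paths: every card produces digital video and
# runs it through a D->A converter; the only difference is what reaches
# the connector.

def dac(frame):
    # Placeholder digital-to-analog conversion stage.
    return f"analog({frame})"

def video_output(digital_frame, port):
    analog_frame = dac(digital_frame)  # this stage happens either way
    if port == "DVI-I":
        # Option 1: both signals come out of the connector.
        return {"digital": digital_frame, "analog": analog_frame}
    elif port == "VGA":
        # Option 2: the digital signal is simply discarded.
        return {"analog": analog_frame}
    raise ValueError(f"unknown port: {port}")
```

The point the sketch makes: a VGA port does strictly less than a DVI-I port with the same input, since the digital half is thrown away rather than never produced.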

My intention here is to explain that there is no good reason for putting VGA outputs (exclusively) on computers. I would like to see more people get upset about this so we can influence the market. If I have failed to persuade you, please post a rebuttal so I can revise my argument.


  1. Today I responded to someone on engadget who said: "VGA may be outdated, but it's hardly worthy of extinction. Anyone who is using a computer based off of this motherboard isn't going to be running a 1680 x 1050 resolution and playing games."

    VGA is 100% worthy of extinction. Taking the digital video that comes from the GPU, converting it to analog, and then trashing the digital signal is an old, antiquated approach. DVI-I does 640x480 better than VGA. Its advantage doesn't suddenly become relevant at 1680 x 1050 or when games are involved; it was relevant the moment it was invented.

    The video on that board, just before it reaches the connector, is like this: imagine buying something with cash at Target. When they give you the receipt, it consists of both a cash receipt on top and a gift receipt on the bottom. (That is the video in both digital and analog form.) You have the ability to exchange that item at Target, give it as a gift, or return it for its full cash value and use the money at millions of other places. That is what you get with DVI-I. Now, if you tear the top off of that receipt and burn it, you are left with VGA. Can you see why those of us who are crying for DVI are frustrated? It's like having the cashier burn the receipt in front of you. It benefits no one. It only destroys value and limits options.

  2. CRTs.

    In an effort to force CRT monitors off the market, the companies producing them never equipped CRTs with DVI or any other more modern connection method, instead reserving those for LCD/Plasma displays and marketing them as "brand new technology" (despite the fact that DVI had been available for quite a long time and CRT displays are superior).
    Most people and businesses still use CRTs, as do serious gamers and graphic designers. The image quality available on a high-end CRT is still leaps and bounds ahead of LCD/Plasma displays.
    I was quite distressed when my video cards stopped coming with VGA ports, and I am most glad to have acquired a converter. Unless you can manage to force the display companies to keep making high-end CRTs and modernize their connectors, the VGA connection still has some life left for those of us not yet willing to surrender our CRTs to these selfish marketing ploys.

  3. @Alimas,

    I don't think you get it. If you read my article, I clearly explain that DVI output can drive older VGA displays. Even old green-on-black and amber-on-black monochrome displays can be driven from a DVI output, as long as the monitor has a VGA input connector. The DVI-to-VGA adapter (or cable) has no electronic components. It simply changes the connector shape and ignores the wires that carry the digital signal. How can I be any more clear?
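To illustrate that the adapter is pure rewiring, here is a representative subset of the DVI-I-to-VGA pin mapping as a plain lookup table. The pin labels follow the commonly published DVI-I and VGA pinouts; treat the exact numbers as illustrative rather than authoritative:

```python
# Passive DVI-I -> VGA adapter: each analog pin on the DVI-I side is wired
# straight to the corresponding VGA pin. The digital TMDS pins are simply
# left unconnected; no active components are involved.

DVI_I_TO_VGA = {
    "C1 (analog red)":    "1 (red)",
    "C2 (analog green)":  "2 (green)",
    "C3 (analog blue)":   "3 (blue)",
    "C4 (analog h-sync)": "13 (h-sync)",
    "8 (analog v-sync)":  "14 (v-sync)",
}
```

Since every entry is a one-to-one wire, the adapter costs next to nothing to make, which matches the sub-$2 price mentioned later in the thread.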

  4. Well, CRTs should've come with (and should come with) DVI connectors for digital transference in the first place. But I give ye that a connector converter can be used.
    But the only reason it's acceptable is that it effectively makes the port backwards compatible with analog systems.

  5. I absolutely agree with what you are saying here. Modern computers should absolutely not have VGA ports. Not to sound like a fanboy (no, really) Apple trashed VGA years ago (with the last iBook G4 I think.... It had mini-VGA). I did just want to point out one teeny little thing. There really isn't anything wrong with using analog signals for transmission of video. Comparing an HDMI cable to good quality Component, I can't tell the difference. At home, I have my media devices about 50 feet away from the television, and as such I could either grab 50 feet of Component (cheap - and with runs up to 200ft) or 50 feet of DVI (really expensive and actually error prone over 15 feet) and I couldn't be happier.

  6. I understand that VGA may not be the latest and greatest, but I do believe that VGA plays an important role. As someone else mentioned, CRT monitors use VGA, and there is a very good reason for this: they are analog devices. As for error correction, that would lead to more issues, such as increased response times. For people on a budget, VGA can be just the ticket. One, monitors that include DVI capabilities generally cost a fair amount more for similar build quality, and two, there is no need to purchase an adapter to make the connector compatible with the cheaper monitor.

    While I am surprised that VGA is still as common as it is, I try to look at it in a different light: Think about floppies. Sometimes, nothing beats the simplicity of booting from a floppy. While the BIOS on many computers needs to be configured to boot from a CD, or at least to press a certain key to enter the boot menu, the BIOS on very many computers is already configured to boot from a floppy drive, if there is a floppy in the drive (and a floppy drive is present, of course).

    Really, I do think there need to be more integrated DVI options to choose from, but I do not feel the need to make VGA disappear (yet).

  7. I don't know how many times I'm going to have to address this first one...
    "CRT monitors use VGA: They are analog devices."
    - Yes, and DVI-I supports 100% of the analog devices with a <$2 adapter.
    "As for the error correction, that would lead to more issues, such as lower response times."
    - Gigabit Ethernet has error correction and I don't hear anyone complaining about response times.
    "monitors that include DVI capabilities generally cost a fair amount more"
    - NewEgg has 1 CRT for sale. An LCD can only display a digital signal. If it has a VGA input, it is running it through an A/D converter, and those cost extra money.
    "there is no need to purchase an adapter to make to connector compatible with the cheaper monitor."
    - I have found these adapters for <$2.

    So again, the only benefit of having a VGA port on a brand new netbook, notebook, or *-ATX/ITX motherboard is to save $2 for the people who buy old CRTs at GoodWill. No one has managed to successfully argue in support of <1% of all monitors sold getting to dictate the connector on >80% of all motherboards. It does not make sense.

  8. "Gigabit Ethernet has error correction and I don’t hear anyone complaining about response times."
    Response time is different from bandwidth. The bandwidth is awesome, but the overall response time is worse.
    "If it has a VGA input it is running it through a A/D converter, and those cost extra money."
    The first part is true. The second part, however, seems to be false. The cheapest (new) monitor I could find that includes DVI is $20 more than its VGA-only counterpart. ($80 -vs- $100)

  9. I never said error correction could be done without latency. When packets with CRC errors are retransmitted, you are going to get latency and, subsequently, worse response times. The protocol could allow for choosing whether or not to correct errors; the latter option would mean degraded video quality. Picture quality vs. response time would be a trade-off the consumer could choose. (Cheaper receivers could omit error checking altogether.) Ultimately, the way to avoid both penalties is to bypass the source of interference or use better shielding. Sometimes when fishing cables you can't avoid it, so compensation would be nice. Again, let us remember that DVI does not include error correction, so we are just playing the "what if" game. What DVI-I (the most common form of DVI output) does include is the analog signal required to run old or budget VGA displays.
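The retransmission penalty described in that trade-off can be estimated with a one-line expectation. The per-packet time and error rate below are made-up illustrative numbers, not measurements:

```python
# Expected per-packet latency when each corrupted packet is resent once
# (simplifying assumption: the retransmission itself always succeeds).

def avg_latency(packet_time_us, error_rate):
    """Expected time per delivered packet, in microseconds."""
    return packet_time_us * (1 + error_rate)

print(avg_latency(10.0, 0.01))  # 1% errors -> 10.1 us average
```

The sketch shows why the penalty scales with link quality: on a clean cable the cost of error correction is nearly zero, while a noisy run pays proportionally more, which is exactly the picture-quality-vs-response-time trade-off being argued about.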

  10. New light:
    I was just reading some articles about video signal handling technologies.

    I say, "Ditch VGA, DVI, S-Video, HDMI, RCA (composite and component); bring on DisplayPort with a nice big, widespread, strong-willed introduction on every new motherboard that can handle it."

    Really, I think the continued existence of VGA is more about the fact that using DVI to its fullest would bog down some lower-end motherboards, so the manufacturers simply force people to use VGA. Unfortunately, this tactic also shows up on boards that would do fine with DVI, and that does - at least somewhat - confuse me.

  11. Okay computer_freak_8, DisplayPort is a whole other monster. It is a distinct upgrade from current tech, and that clouds my point. Sure, we would all like upgrades; I really can't make my point any more clear. Also, no motherboard that has VGA would be bogged down by DVI. DVI is less intense than VGA, because the digital signal that comes out of the GPU does not need to be run through a D/A converter. It could be kept digital-only and called DVI-D. Add an additional D/A stage and you get DVI-I. Then say "screw you, consumer, I'm not putting a DVI connector on the thing" and you get VGA. All it is is a connector. If you had good schematics and a steady hand, you could probably solder a DVI-I connector onto most of these motherboards. Most SBCs have headers for DVI even if they have VGA ports.

  12. Oops. Yes, I made a mistake. Apparently, I misread the information I was looking at before I made my last comment. (I was under the impression (falsely) that VGA's top resolution was lower than the top resolution of DVI.)

    I want to make it clear that I believe you do have an excellent point, (it is pointless for motherboards to include only VGA connectors,) but I also believe that there are more options than just trashing VGA connectors from motherboards completely.

  13. The video card on my desktop has two DVI ports, my CRT monitor has one VGA port. No problem connecting the two (thanks to one of the two adapters which came free with the video card). My portable computer has a VGA port, no problem using it on a daily basis with a video projector which only has a VGA port; but I suppose that particular projector, if not all, could be used just as well if my notebook had only a DVI port and I used the same VGA cable and a DVI adapter.
    Why the hell do they still put VGA ports on new computers?
    I read somewhere that the quality of the image you get if you use DVI is up to twenty per cent better than what you get with VGA.
