33

Earlier connectors for devices attached to personal computers tended to be very bulky. A lot of them used D-Sub connectors; keyboards and mice used DIN and later mini-DIN connectors. Even the relatively new DVI connector was big and bulky and had to rely on side screws to stay attached.

In contrast, today we use the much smaller connectors of HDMI, USB and DisplayPort. Each with their own variations, but as a rule small, easy to use, and no screws involved. What changed that made this possible (or why else wasn't it done earlier)?

(The DVI connector is particularly perplexing, because it carries the same signal as HDMI, so it's not as if the communication technology improved there. USB was also already a thing with its smaller and more convenient connectors. It should have been possible to make a connector similar to HDMI right from the start)

17
  • 8
    Higher chip frequency made high-bandwidth serial connection possible (and then necessary), so connectors could have fewer pins. Also, the market for portable devices gradually became large enough for dedicated connectors to be made. Before that, most computer connectors were existing industry-standard connectors like you mentioned.
    – DL444
    Commented Jul 11 at 9:10
  • 20
    Given the typical lifespan of, for example Micro-USB receptacles, I'd rather ask the opposite question...
    – tofro
    Commented Jul 11 at 10:39
  • 16
    Ironically, the "sub" in "D-Sub" is short for subminiature. Wikipedia says, "When they were introduced, D-subs were among the smallest connectors used on computer systems." Commented Jul 11 at 13:03
  • 10
    Completely anecdotal, but DVI and older connectors don't need screws to stay in; the screws just ensure they stay put when accidentally pulled. Back in university my friend's PC was the only one out of a house of 5 that didn't get stolen when they had a break-in - the monitor was hanging off the desk by the VGA lead, it had been picked up and then left. The thieves didn't want to spend the time unscrewing either end so just left it.
    – Matt Lacey
    Commented Jul 11 at 14:57
  • 5
    We certainly did not call them "bulky" at the time! Commented Jul 11 at 19:16

8 Answers

53

TL;DR: Progress

  • Most important: Volume
  • Second: Reduced Variation
  • Third: Both Sides Need to Agree
  • And Last: Changed Technology
  • .. oh, and again: Volume

In 1970 a computer selling 100 units was a success. By 1980 selling 20,000 of a single type was huge. Today a laptop model selling only a million units is a flop. And while the market in 1980 was already past a million units per year, each manufacturer had its own interfaces, especially for video - if video was integrated or present at all.

A D-Sub connector can be soldered without any tool, using manual labor, whether to a PCB or a cable. Perfect for any volume, especially a low starting volume. At the same time it can be machine-manufactured (wave soldering etc.) when production ramps up. And up to the mid-1980s all microcomputer manufacturers were rather on the start-up side.

A small connector is no good as long as every manufacturer uses a different one. It wasn't until the 1990s that VGA, introduced in 1987, became a (mostly *1) accepted standard. There can't be a standard connector without a widely accepted interface standard. For components this meant that only a standard connector allowing variations, like D-Sub, would gather enough customers to drive the price down, making it more attractive than any nice, small but special one produced in low volume.

Next, new, smaller connectors can only be added if both sides agree: computer and printer / modem / keyboard / screen / etc. Easy in a closed situation, as Apple showed with some of their machines (*2), next to impossible in an open environment, where it is important for manufacturers to offer interfaces compatible with as many different computers or peripherals as possible.

This only changed somewhat when one platform, the PC, acquired an overwhelming market share, leading most peripheral manufacturers to use its interfaces by default. Still, it was a 10+ year fight to get even those seemingly clearly defined standard interfaces really working the same on all devices (*3).

Last, those smaller connectors need different mounting. A D-Sub can be mounted in thru-hole style. Its pins not only neatly fit the 2.54 mm grid of 1980s PCB technology, but also give it a stable mechanical mounting. Smaller connectors mean a denser pin arrangement than 2.54 mm, something that did not fit the existing technology (*4). To fit such a connector, a sub-PCB would have been needed to break out the connections to the 2.54 mm standard. Only the introduction of SMT made it possible to fit such small parts, but by 1986, when most PC interfaces were already defined, SMT was cutting edge and used in less than 10% of all PC manufacturing.
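
To get a feel for the numbers, here is a rough back-of-the-envelope sketch (my illustration, not part of the original answer; the 0.5 mm figure is just an assumed "modern fine-pitch" value, not any specific connector) comparing the contact-row length of a through-hole D-Sub with the same contact count at a fine SMT pitch:

    # Rough contact-row length: (contacts_per_row - 1) * pitch.
    # D-Sub in-row pitch is roughly 2.77 mm (0.109"); 0.5 mm stands in for a
    # generic modern fine-pitch SMT footprint (illustrative assumption only).
    def row_length_mm(contacts_per_row: int, pitch_mm: float) -> float:
        return (contacts_per_row - 1) * pitch_mm

    print(f"DB-25 long row, 13 pins @ 2.77 mm: {row_length_mm(13, 2.77):.1f} mm")
    print(f"Same 13 contacts @ 0.5 mm pitch:   {row_length_mm(13, 0.5):.1f} mm")

Roughly a factor of five in a single dimension - which is why connectors that small only became practical once SMT assembly could handle sub-millimetre pitches.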

It took a good decade to roll out SMT across the industry, enabling the use of smaller connectors without a notable increase in cost. That is exactly when smaller connectors started to be considered and introduced. Nice, isn't it :)


Oh, and there is one more point, one trumping all others, but easily ignored when just looking at technology:

Why Would One Need a Smaller Connector At All?

I mean, why would someone need smaller connectors than D-Sub? A PC is installed once and done. Maybe upgraded after 3-4 years with more memory or a better screen, but that's it. It's important that all connectors stay in place, not how big they are. They are at the rear anyway and even hardware nerds don't replug their devices every other day.

Modern laptops (that is, slimline ones *5), tablets and phones have different requirements. They get unplugged and moved all the time, are used in different configurations, and at the same time offer much narrower case edges for connectors. Here easy plugging and smaller size are helpful (*6).

But those are a development of the noughties. Before that, there was no real need for slimline connectors like USB or HDMI on computers.


The Details:

Earlier connectors for devices attached to personal computers tended to be very bulky.

Were they? I'd rather say they were a marvel of compactness compared with what came before. The D in D-Sub stands for the D-like shape, while Sub means subminiature - it was considered extremely small when introduced in 1952.

D-Sub and Mini-DIN were the tiny connectors of their time.

When early computers started, their makers had neither the investment nor the volume to create new connectors, so they used what was there, and D-Sub was a versatile, standardized choice for them. Not to mention that those early micros were bulky themselves.

In contrast, today we use the much smaller connectors of HDMI, USB and DisplayPort. Each with their own variations, but as a rule small, easy to use, and no screws involved.

Well, this "Each with their own variations" is the second major difference:

Each of the 'new' ones is a single-purpose connector.

D-Sub and (Mini-)DIN (*7) are general-purpose connectors with no specified use case beyond connecting. They are universal connectors, meant to be used in cases not yet defined when they were designed - as well as open to adaptation. Adaptation is what made D-Sub and Mini-DIN the main choice for all-around usage, the PC being the best example:

  • DB-25 for parallel
  • DA-15 for Joysticks
  • DE-15 for VGA and
  • DE-9 for serial

All using the same technology, tools and supply chain, but vastly different use cases (*8).

All the 'new' ones are special to type, created for one use and one use only, with a fixed layout, fixed signalling and no customisation intended.

What changed that made this possible (or why else wasn't it done earlier)?

First of all

(The DVI connector is particularly perplexing, because it carries the same signal as HDMI, so it's not as if the communication technology improved there.

Not really, as DVI also (can) carry analogue, VGA-type video signals. And that's important, as DVI was a bridge technology to allow a smooth transition between analogue VGA transmission and doing the same digitally (*9). The same video card could be used to drive analogue or digital screens, while either screen could also use the same cable, independent of the transport used.

USB was also already a thing with its smaller and more convenient connectors. It should have been possible to make a connector similar to HDMI right from the start)

Sure, but that would have required more changes in infrastructure (computers, graphics cards, screens and cables) at once - an up-front, high-risk investment no diverse industry likes to jump on. USB only worked because it was a new interface for new use cases, offered as an add-on, not a replacement. Just remember how PCs still carried PS/2 in parallel for more than a decade, even when USB HID devices were already well established.

The slow switch to a smaller USB connector is in fact a perfect example of how hard it is to establish a new connector for an existing interface. New computers and even tablets still carry the 1994 USB-A socket in 2024, despite Mini being introduced in 2000 and Micro in 2007 (*10). Only the 2014 USB-C is making some ground ... 10 years after introduction.

DVI had the same issue and was designed to cover it by creating as much compatibility as possible. DVI

  • fits thru hole mounting,
  • fits existing connector mounting,
  • offers the same mechanical stability,
  • (can) avoid unintended disconnection,
  • does not need new handling in production,
  • can be used in the same environments as VGA.

All points that are important for providing a smooth transition between VGA and DVI. DVI was all gain, no loss (*11).

HDMI in turn

  • does need SMD mounting,
  • needs new mounting,
  • is far less mechanically stable,
  • has low resistance against unintended disconnection,
  • can only be used in moderate conditions.

In fact, HDMI wouldn't be much used with computers if it hadn't been made as a consumer standard for TV, built on DVI standards. Consumer electronics not only have far less strict requirements for reliability but, most importantly, extremely high sales volumes, making all parts dirt cheap. As a result HDMI

  • was compatible with DVI,
  • allowed direct connections between DVI cards and HDMI displays,
  • allowed (to some degree) the use of HDMI sources with DVI displays,
  • reduced production cost.

For PC manufacturers this meant that, as a first step, DVI-to-HDMI cables were all that was needed to reach more display solutions (aka TVs), while display manufacturers could make their displays fit both worlds. Being compatible meant that makers of devices with restricted space could use HDMI connectors while still connecting to DVI screens. With enough HDMI displays widely available (aka sold as TVs), it made sense for all PC manufacturers to add HDMI instead of DVI. All of that was 'good enough' for average office installations. So the next smooth transition happened.

Bottom line: it's a story of progress quite similar to the one from bulky pre-WW1 cars to today's low-profile cars (*12).


*1 - Non-PC machines, like workstations (think NeXT) or Apple, still used their own designs way into the 2000s. Apple even had its own version of DVI, the ADC - essentially the same, but a different connector, requiring a USD 150 cable to connect :)

*2 - Just think of ADB or the square SCSI connector used by some PowerBooks. Of course this increased the price tag for those devices, as well as for anyone else who wanted to connect.

*3 - Some may still remember the hassles a seemingly simple 'standard' port like the PC's parallel port gave us in the 1990s.

*4 - Some may remember the drive of Japanese companies for a 2 mm grid in the late 1980s as an attempt to increase THT density and pin count.

*5 - An IBM Portable PC, Osborne or Compaq was as big as a desktop PC, with plenty of room for D-Sub. Likewise nearly all laptops, from early Toshibas until the late 1990s / early 2000s. Only after that could a full-featured PC be put in a case thinner than several cm.

*6 - Also a reason why USB-C is taking over from the bulky HDMI connector :))

*7 - Mini-DIN already being a shrink from DIN connectors for convenience.

*8 - At this point it's important to keep in mind that the seemingly default assignments used by the PC are IBM-specific definitions made exclusively for the PC. Other computers had other uses for the same connectors and/or other connector variations.

*9 - Also the reason why the digital protocol is still tied to video timing.

*10 - Does anyone remember those Super-Speed monsters?

*11 - Note that Mini-DVI and Micro-DVI are custom connectors made by Apple for their laptops, not any standard development.

*12 - Pre-SUV era that is.

16
  • 4
    Beautiful! You win! :)
    – Vilx-
    Commented Jul 11 at 17:00
  • 3
    @gidds Not really. For one, Apple was a total niche manufacturer at the time and only really present in the US, while PC development was already driven from Asia (where Apple computers were virtually non-existent). More importantly, PCs already adopted USB in 1996 with Intel's 440FX chipset for Pentium Pro and Pentium II CPUs - a good two years before Apple added USB with its G3 iMacs. It was more of a cost-cutting measure. The 'real' Macs got FireWire as their main serial interface. The only ones forced were iMac users, who had to buy everything new - including a hub, as 2 USB ports weren't much.
    – Raffzahn
    Commented Jul 11 at 23:13
  • 2
    And those "small and modern connectors" were often quite crappy. I don't know how many times I've had problems with HDMI connectors due to bad contact or uncommanded unplugging. Let's not start about mini HDMI connectors breaking off of the Mainboards in Notebooks. Not once happened with VGA. Or remember micros USB-B? Finnicky, prone to filling up with dirt, just really bad. Those DB-Connectors are mad for industrial (ab)use and it shows... Did I ever have prolems with bent Pins on DIN-Connectors? No. But on thos pesky mini-DIN: Yes!
    – kruemi
    Commented Jul 12 at 8:02
  • 2
    @AlexanderThe1st Serial, parallel and joystick ports all had screws (or at least the possibility to screw them in). They were not always used, but the terminals were there... So yes, it added significant bulk (and cost). But remember: those were industrial connectors (I don't know if the HD15 VGA connector was used for anything else, but all the others were). So it made sense for them to have a secure connection.
    – kruemi
    Commented Jul 12 at 8:25
  • 5
    @AlexanderThe1st D-Sub has been around since roughly the 1950s, long before PCs were even an idea. They were made as industrial connectors. The PC industry in the beginning just used what was available on the market, so they used those industrial-grade connectors. HD15 and DVI just share the same heritage (because people don't want to invent everything anew if they have a working system that isn't broken).
    – kruemi
    Commented Jul 12 at 9:38
20

D-Sub and DIN cables are bulkier because they are user-serviceable.

You can repair them with simple tools and also wire up your own cables. You only need a screwdriver and soldering iron.

This was necessary because there were no common standards. Different brands required different pinouts for the same function. Sometimes even different models of similar products from the same manufacturer required different pinouts.

3
  • It was fine until the distinction between DTE and DCE was introduced, as if everything has to be one or the other. From then on, you would never know whether the data (and handshaking) pin connections needed to be swapped, so you'd go on site armed with a screwdriver, pliers, and soldering iron (and possibly a break-out box), just to plug in a connector. Hey, I had a data logger, sometimes connected to the computer and sometimes to the modem. Disaster. Commented Jul 12 at 21:16
  • It might have had something to do with suppliers wanting to automate the manufacture of connectors using ribbon cables, but without any line reversals. But they could have done that by putting all the 'ins' at one end and all the 'outs' at the other end. Commented Jul 12 at 21:23
  • And very importantly: that's how home computers started, just a bunch of guys putting off-the-shelf hardware together, and many computers were sold as kits (including the Apple I) you had to assemble yourself or at least had that as a cheaper option. And even in modern times, homebrew computers are quite likely to use such connectors given how much easier they are to build with and their durability. Though even that is changing recently with much faster and cheaper professional low-scale PCB production and easily available tools for working with SMT.
    – Luaan
    Commented Jul 13 at 11:15
17

Wide-pitch connectors are:

  1. Easier to wire up manually. If a wire breaks in a modern fine-pitch connector, you have to toss the whole cable. With older wide-pitch ones, you can DIY custom cables.
  2. Capable of carrying larger currents. USB-C is an exception, but it requires complex circuitry to negotiate and limit the current so it won't start a fire (see the sketch after this list).
  3. More durable. They can withstand more insertion-removal cycles and more accidental forces. USB cables break all the time; that's OK when they're standardized. Pre-1990s, you might have had to wait for a replacement from the original computer manufacturer.
  4. Higher quality, when manufactured on low-precision equipment, in small batches, without 6-sigma quality controls.
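
To illustrate point 2, here is a minimal sketch (my addition, not the answer author's; the voltage/current pairs are the commonly cited USB Power Delivery fixed-supply levels, so treat them as approximate) of why negotiation is needed at all: the same small pins may carry anything from the legacy 2.5 W up to around 100 W, and nothing above the default may flow until source and sink have agreed on it.

    # Legacy USB default vs. typical USB Power Delivery fixed-supply levels
    # (values quoted from memory; see the USB PD spec for the normative list).
    # Higher levels are only enabled after negotiation over the CC wire.
    profiles = [
        ("USB 2.0 default", 5.0, 0.5),
        ("USB-C @ 3 A",     5.0, 3.0),
        ("PD fixed 9 V",    9.0, 3.0),
        ("PD fixed 15 V",  15.0, 3.0),
        ("PD fixed 20 V",  20.0, 5.0),  # 5 A also needs an e-marked cable
    ]
    for name, volts, amps in profiles:
        print(f"{name:16s} {volts:4.1f} V x {amps:3.1f} A = {volts * amps:5.1f} W")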

For the same reason, modern prototype devices tend to remain bulky and use bulky connectors, including D-sub type.

In addition to pitch, some used parallel buses. user3528438 has already addressed the reasons that some older interfaces used a lot more pins than modern ones. But that's give-and-take; S-video had 4 pins and PS/2 6 pins, while modern HDMI has 19 and USB-C has 24 pins. In all cases, though, pitch was definitely wider in the old days.

2
  • 5
    This is the only answer mentioning current - older computers were notoriously inefficient and used higher currents. electronics.stackexchange.com/a/35464/133673 shows VGA was a minimum of 300 mA whereas DVI and HDMI's minimum is only 55 mA. Higher current means more cross-sectional area AND larger pins for more surface contact area, which takes space.
    – Criggie
    Commented Jul 12 at 12:02
  • @Criggie When VGA was introduced, no one was thinking about powering anything through that connector. The +5V for the DDC EEPROM was a late addition to the VGA pinout. The highest current on an original VGA connector is the roughly 1V analog output into the 75 ohm termination at the monitor, which is just about 13mA. So it is unlikely that current handling contributed to the connector choice. Commented Jul 14 at 16:14
15

Because they were already there.

You could buy D-type or DIN connectors from an electronics supplier, and incorporate them into the computer you were designing. This presumably left you free to concentrate on important stuff.

USB, etc., were inventions in the future. Meanwhile, there was money to be made selling computers.

D-type connectors were already in widespread use for computer equipment outside the "PC" space. Modems (which were self-contained boxes, not boards plugged into your computer backplane), for example, probably had D-type connectors.

DIN connectors were common in home audio equipment.

2
  • 3
    Exactly. PC companies didn't make their own components: they used what they could buy. D-subminiature connectors were small in their time.
    – John Doty
    Commented Jul 11 at 15:14
  • And equivalently, there really wasn't anything smaller. Commented Jul 12 at 8:40
10

Reason 1: Early communication protocols were more primitive, and a lot of control signals were needed but were neither multiplexed into data pins nor implemented as command sets. RS-232, LPT, IDE, SCSI and VGA belong to this category.

Reason 2: Lots of early high-bandwidth communication protocols needed many data pins to actually reach their performance goal. This naturally increased the pin count, and also the number of ground pins needed to maintain data integrity. Contemporary high-speed communication ports are all serial by nature and can reach high bandwidth with fewer pins, e.g. SATA vs IDE, USB vs LPT and PCI-E vs PCI(-X).
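
As a rough illustration of this point (my own back-of-the-envelope figures, not the answer author's; peak rates are nominal and the pin counts are approximate data/signal pins only, grounds and power ignored), compare how much bandwidth per data pin the parallel buses delivered versus their serial successors:

    # Very rough "bandwidth per data pin" comparison, parallel vs. serial.
    # Nominal peak rates; pin counts are approximate data/signal pins only.
    interfaces = [
        ("PCI 32-bit/33 MHz", 133, 32),
        ("PCIe 1.0 x1",       250,  4),  # one lane = 2 differential pairs
        ("ATA/IDE UDMA-100",  100, 16),
        ("SATA 1.5 Gb/s",     150,  4),  # 2 differential pairs
    ]
    for name, mb_s, pins in interfaces:
        print(f"{name:18s} {mb_s:4d} MB/s over {pins:2d} data pins"
              f" = {mb_s / pins:6.1f} MB/s per pin")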

Reason 3: Early connectors depended on the provision of more signals for more functions. It wasn't considered possible to extend the functionality of a connector by extending the signal command set. As a result, the female connector usually had provisions for all the functionality that was possible and available, but the male connector might only connect a few of the pins. The iPod, Zune, and early 2G dumb mobile phones belong to this type. They all have analog audio, serial communication and power built into one connector, but most users only use the power pins. The 25-pin RS-232 connector is also over-designed for most use cases, hence the 9-pin version is much more widely used.

Reason 4: In the early days, designers of a communication port looked very far into the past (for backwards compatibility) and very far into the future (for extensibility). DVI is a good example: it has provisions for analog signals, hence is compatible with analog displays with a VGA port. It also has provisions for dual-link digital transmission, for double the bandwidth if needed. HDMI is a counter-example that removes these provisions and relies on an active adapter (DAC) to convert to analog/VGA, and on higher clock speeds for more bandwidth in the future (like PCI-E). Modern smartphones have likewise removed the provision for analog audio and rely on an active adapter/DAC for analog audio output. Fundamentally, after a few decades of actually using these devices, we have a better idea of what we need and what we don't.

Reason 5: We are better at making connectors, and they have become physically smaller. USB Type-C is a good example. It has many pins, which run at extraordinarily high speeds, but the connector is very small. USB-C carries no fewer pins than HDMI, SATA or USB Mini-B (SuperSpeed), but is actually smaller than all of them.

2
  • I'm really talking about the connector size. I understand that more wires were used e.g. for parallel connections, but why couldn't they use a smaller connector? HDMI and DisplayPort have way more pins than, say, DIN5 or DSub9, but they're much smaller.
    – Vilx-
    Commented Jul 11 at 10:22
  • 1
    Pretty complete list, but you might want to add more relaxed FCC (or whatever your national board controlling radio emissions might be) regulations.
    – tofro
    Commented Jul 11 at 12:58
3

It's the good old chicken and egg issue.

In short, what changed is the need for smaller connectors that must pass data faster, and the ability to make them cheaply enough to be viable. And since connectors for faster data transfer become available more cheaply, that in turn boosts the development of even faster data transfers, because they can now be passed through a connector.

The most likely factor is of course the cost and the economy/market, and the reasoning is a bit circular.

The first computers used connectors that connector manufacturers could make with their existing machines and tooling, so any connector used had to already exist, be cheap, be available for purchase and be able to pass the required signals adequately. If you wanted a custom connector, that was fine, but you were limited by what the manufacturers could currently make, and it was still expensive if you alone needed it for some exotic purpose.

Often the connectors were defined by a standard, which is why DB-25 was used for RS-232 interfaces or DIN for MIDI. And, as with Commodore products, if you had already used some connector for some purpose, like the serial bus for disks and printers, why change it and make things incompatible?

As technology for making computers generally advances in all areas, you suddenly have new data transmission needs and find you need better connectors that can pass data faster and be more convenient. Fortunately, connector manufacturers are by then also able to make better and more suitable connectors for higher speeds, so you can use those, define new connectors in line with what manufacturers can make, and fix those connectors into the standards. A standard also allows defining a custom connector best suited to that standard, and then, as everyone starts to use it, it becomes cheaper than a roll-your-own solution, so everyone wins.

That happened for e.g. USB and DVI. USB first existed as an electrical interface and a protocol, but the connector was not yet needed before the first motherboards had USB chipsets, and before someone actually needed the interface. For example, many Pentium motherboards had USB controllers, but only as pin headers on the motherboard, because no users had any USB devices yet, or OSes with USB drivers, or any use for USB, or even knew what USB was, let alone how USB would appear as a port somewhere on their computer. Then, when the technology had matured a bit, standard connectors became available that you could put on your ATX motherboard, with a hole for them in the IO area.

Same with DVI. You needed a good connector for many purposes, not just for consumer use. It really needed to be supported by industrial and all other market segments to be accepted as mechanically suitable wherever it was needed. You cannot rely on a later invention called HDMI, which is clearly a cheap consumer connector for home and light office use: the connector will wiggle loose eventually and need replugging, and there is a limited number of times you can, for example, plug HDMI into your laptop before the contacts-remaining vs. socket-insertions curve drops below a critical level. DVI was a proposed standard for replacing older digital panel connectors with one standard based on a serial link. Older ones such as OpenLDI, LDI, FPD-Link and others simply used the LVDS link as a base, so they needed more wires and did not work for longer connections the way DVI does.

Also, saying that DVI carries the same signal as HDMI is not entirely correct. They are similar: HDMI is based on DVI, and HDMI devices are required to be backwards compatible with DVI. They are similar in that both are CML electrical interfaces using the same basic TMDS data encoding, but DVI and HDMI are not just different connectors; they are also different protocols, where an HDMI-compatible source can identify an HDMI-compatible sink and communicate all kinds of metadata, such as "hey display, there is now 10-bit YCbCr data in 1080p format, and here's also the accompanying audio as 8-channel 96 kHz PCM", which is what makes it a consumer TV interface. DVI is limited to 8-bit RGB video only. HDMI also defined that it is possible to use HDCP between devices, which DVI initially did not support. The HDMI standard also says that an HDMI connector must be used and that it must support the HDMI protocol, while the DVI standard allows only the DVI protocol, as it does not know anything about HDMI.
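
For the part the two interfaces do share, here is a small sketch of the TMDS link arithmetic (my illustration, not part of the original answer): each 8 data bits are encoded into 10 line bits by TMDS, there are three data channels, and single-link DVI tops out at a 165 MHz pixel clock.

    # TMDS link arithmetic shared by single-link DVI and HDMI:
    # 3 data channels, 10 line bits per channel per pixel clock (TMDS coding).
    TMDS_CHANNELS = 3
    BITS_PER_CLOCK = 10            # 8 data bits expand to 10 line bits
    SINGLE_LINK_MAX_MHZ = 165.0    # single-link DVI pixel-clock ceiling

    def tmds_rates(pixel_clock_mhz: float):
        per_channel_gbps = pixel_clock_mhz * BITS_PER_CLOCK / 1000
        return per_channel_gbps, per_channel_gbps * TMDS_CHANNELS

    # 1080p60 (CEA-861 timing) uses a 148.5 MHz pixel clock.
    per_ch, total = tmds_rates(148.5)
    print(f"1080p60: {per_ch:.3f} Gb/s per channel, {total:.3f} Gb/s total")
    print(f"Fits a single link: {148.5 <= SINGLE_LINK_MAX_MHZ}")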

0

Another factor:

The higher the specs of the cable, the more precision is required in manufacture. Look at your USB-C cable: there are a lot of contacts very close together and not shielded from each other. Ensuring that every lead connects to its intended target and nothing else requires things to be pretty precise.

The old cables were far, far more tolerant of imperfection.

2
  • True, but does that mean that they couldn't do things as precisely in the 80's? They did have printed circuit boards and microprocessors, which should IMHO already have had a comparable level of manufacturing precision.
    – Vilx-
    Commented Jul 13 at 0:08
  • 1
  • @Vilx- a circuit board is one item, and it is unlikely to matter if the whole design is shifted slightly. USB-C joins items from different manufacturers. Commented Jul 13 at 1:26
0

The military and industrial heritage should not be forgotten. Modern (household/civilian) USB, HDMI and other similar connectors demand very careful use and stable physical conditions, and offer a very short service life and no possibility of field repair. But they are cheap in mass production and convenient for automation. In the 80s and 90s the legacy of military requirements for physical reliability (vibration resistance, temperature fluctuations) had not yet been completely lost. Plus, technology was still developing: thinner wires, complex coatings, new plastics... and higher frequencies, with the demands on the physical characteristics of conductors that come with those very frequencies.
