TL;DR: Progress
- Most important: Volume
- Second: Reduced Variation
- Third: Both Sides Need to Agree
- And Last: Changed Technology
- .. oh, and again: Volume
In 1970 a computer selling 100 units was a success. By 1980, selling 20,000 of a single type was huge. Today a laptop model selling only a million units is a flop. And while the market in 1980 was already past a million units per year, each manufacturer had its own interfaces, especially for video - if video was present at all and not integrated.
A D-Sub connector can be soldered by hand without any special tooling, no matter if to a PCB or a cable. Perfect for any volume, especially a low starting volume. At the same time it can be machine manufactured (wave soldering etc.) when production ramps up. And up to the mid-1980s, all microcomputer manufacturers were rather on the start-up side.
A small connector is no good as long as every manufacturer uses a different one. It wasn't until the 1990s that VGA, introduced in 1987, became a (mostly *1) accepted standard. There can't be a standard connector without a widely accepted interface standard. For components this meant that only a standard connector allowing variations, like D-Sub, would gather enough customers to drive the price down, making it more attractive than any nice, small, but special one produced in low volume.
Next, new, smaller connectors can only be added if both sides agree: computer and printer / modem / keyboard / screen / etc. Easy in a closed situation, as Apple showed with some of their machines (*2), next to impossible in an open environment, where it's important for manufacturers to offer interfaces compatible with as many different computers or peripherals as possible.
This only changed somewhat when one platform, the PC, acquired an overwhelming market share, making most peripheral manufacturers use its interfaces by default. Still, it was a 10+ year fight to get even those seemingly clearly defined standard interfaces really working the same on all devices (*3).
Last, those smaller connectors need different mounting. A D-Sub can be mounted thru-hole style. Its pins not only neatly fit the 2.54 mm grid of 1980s PCB technology, but this also gives it a stable mechanical mounting. Smaller connectors mean a denser pin arrangement than 2.54 mm, something that did not fit the existing technology (*4). To fit such a connector, a sub-PCB would be needed to break out the connections to the 2.54 mm standard. Only the introduction of SMT allowed fitting such small devices - except by 1986, when most PC interfaces were already defined, SMT was cutting edge and used in less than 10% of all PC manufacturing.
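To put rough numbers on that density gap, here is a trivial back-of-the-envelope sketch. The pitch values are nominal figures chosen for illustration, not exact connector specifications:

```python
# Why sub-2.54 mm connectors clashed with 1980s thru-hole PCBs:
# how many contacts fit in one centimetre of row length at various pitches.
# Pitches below are illustrative round numbers, not datasheet values.
pitches_mm = {
    "2.54 mm thru-hole grid (D-Sub era)": 2.54,
    "2.00 mm grid (late-1980s THT densification push)": 2.00,
    "0.50 mm (typical fine-pitch SMT connector)": 0.50,
}

for name, pitch in pitches_mm.items():
    pins_per_cm = 10.0 / pitch  # contacts per 10 mm of row
    print(f"{name}: {pins_per_cm:.1f} pins/cm")
```

Roughly a five-fold jump in linear contact density between the classic thru-hole grid and fine-pitch SMT - which is exactly the jump the existing drilling and wave-soldering processes could not deliver.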
It took a good decade to roll out SMT across the industry, enabling the use of smaller connectors without a notable increase in cost - exactly the time when smaller connectors came to be considered and introduced. Nice, isn't it :)
Oh, and there is one more point, one trumping all others, but easily ignored when just looking at technology:
Why Would One Need a Smaller Connector at All?
I mean, why would someone need smaller connectors than D-Sub? A PC is installed once and done. Maybe upgraded after 3-4 years with more memory or a better screen, but that's it. What matters is that all connectors stay in place, not how big they are. They are at the rear anyway, and even hardware nerds don't replug their devices every other day.
Modern laptops (that is, slimline ones *5), tablets and phones have different requirements. They get unplugged and moved all the time, are used in different configurations, and at the same time offer way smaller edges for connectors. Here easy plugging and smaller size are helpful (*6).
But those are a development of the noughties. Before that, there was no real need for slimline connectors like USB or HDMI on computers.
The Details:
Earlier connectors for devices attached to personal computers tend to be very bulky.
Are they? I'd rather say they are a marvel of compactness compared with what came before. The D in D-Sub stands for the D-like shape of the shell, while Sub means subminiature - it was considered extremely small when introduced in 1952.
D-Sub and Mini-DIN were the tiny connectors of their time.
When early computer makers started, they had neither the investment nor the volume to create new connectors, so they used what was there, and D-Sub was a versatile, standardized choice for them. Not to mention that those early micros were bulky themselves.
In contrast, today we use the much smaller connectors of HDMI, USB and DisplayPort. Each with their own variations, but as a rule small, easy to use, and no screws involved.
Well, this "Each with their own variations" is the second major difference:
Each of the 'new' ones is a single-purpose connector.
D-Sub and (Mini-)DIN (*7) are general-purpose connectors with no specified use case beyond connecting. They are universal connectors, meant to be used in cases not yet defined when they were designed - as well as open to adaptation. Adaptation is what made D-Sub and Mini-DIN the main choice for all-around usage, the PC being the best example:
- DB-25 for parallel
- DA-15 for Joysticks
- DE-15 for VGA and
- DE-9 for serial
All using the same technology, tools and supply chain, but vastly different use cases (*8).
All the 'new' ones are type-specific, created for one use and one use only, with a fixed layout, fixed signalling and no customisation intended.
What changed that made this possible (or why else wasn't it done earlier)?
First of all
(The DVI connector is particularly perplexing, because it carries the same signal as HDMI, so it's not as if the communication technology improved there.
Not really, as DVI also (can) carry analogue, VGA-type video signals. And that's important, as DVI was a bridge technology meant to allow a smooth transition between analogue-transferred VGA and doing the same digitally (*9). The same video card could drive analogue or digital screens, while either screen could also use the same cable, independent of the transport used.
USB was also already a thing with its smaller and more convenient connectors. It should have been possible to make a connector similar to HDMI right from the start)
Sure, but that would have required more changes in infrastructure (computers, graphics cards, screens and cables) at once - a high-risk upfront investment no diverse industry likes to jump on. USB only worked because it was a new device class with a new use case, offered as an add-on, not a replacement - just remember how long PCs still carried PS/2 in parallel, for more than a decade even after USB HID was well established.
The slow switch to a smaller USB connector is in fact a perfect example of how hard it is to establish a new connector for an existing interface. In 2024, new computers and even tablets still carry the 1994 USB-A socket, despite Mini-USB being introduced in 2000 and Micro-USB in 2007 (*10). Only the 2014 USB-C is making some headway ... 10 years after introduction.
DVI had the same issue and was designed to get around it by creating as much compatibility as possible. DVI
- fits thru hole mounting,
- fits existing connector mounting,
- offers the same mechanical stability,
- (can) avoid unintended disconnection,
- does not need new handling in production,
- can be used in the same environments as VGA.
All points that were important in providing a smooth transition between VGA and DVI. DVI was all gain, no loss (*11).
HDMI in turn
- does need SMD mounting,
- needs new mounting,
- is way less mechanically stable,
- offers little resistance against unintended disconnection,
- can only be used in moderate conditions.
In fact, HDMI wouldn't be much used with computers if it hadn't been made as a consumer standard for TV, built on DVI standards. Consumer electronics not only have way less strict requirements for reliability but, most importantly, extremely high sales volume, making all parts dirt cheap. As a result, HDMI
- was compatible with DVI,
- allowed direct connections between DVI cards and HDMI displays,
- allowed (to some degree) use of an HDMI source with DVI displays,
- reduced production cost.
For PC manufacturers this meant that, as a first step, DVI-to-HDMI cables were all that was needed to reach more display solutions (aka TVs), while display manufacturers could make their displays fit both worlds. Being compatible meant that makers of devices with restricted space could use HDMI connectors while still connecting to DVI screens. With enough HDMI displays widespread (aka sold as TVs), it made sense for all PC manufacturers to add HDMI instead of DVI. All of that was 'good enough' for average office installations. So the next smooth transition happened.
Bottom line: it's a story of progress quite similar to that from bulky pre-WW1 cars to today's low-profile cars (*12).
*1 - Non-PC machines, like workstations (think NeXT) or Apple, still used their own designs way into the 2000s. Apple even had its own version of DVI, the ADC - essentially the same, but a different connector, requiring a USD 150 cable to connect :)
*2 - Just think of ADB or the square SCSI connector used by some PowerBooks. Of course this increased the price tag for those devices, as well as for anyone else who wanted to connect.
*3 - Some may still remember the hassles a seemingly simple 'standard' port like the PC's parallel port gave us in the 1990s.
*4 - Some may remember the drive of Japanese companies for a 2 mm grid in the late 1980s as an attempt to increase THT density and pin count.
*5 - An IBM Portable PC, Osborne or Compaq was as big as a desktop PC, with plenty of room for D-Sub. Likewise nearly all laptops from the early Toshibas until the late 1990s / early 2000s. Only after that could a full-featured PC be put in a case thinner than several centimetres.
*6 - Also a reason why USB-C is taking over from the bulky HDMI connector :))
*7 - Mini-DIN already being a shrink from DIN connectors for convenience.
*8 - At this point it's important to keep in mind that the seemingly default assignments used by the PC are an IBM-specific definition made exclusively for the PC. Other computers had other uses for the same connectors and/or other connector variations.
*9 - Also the reason why the digital protocol is still tied to video timing.
*10 - Does anyone remember those Super-Speed monsters?
*11 - Note that Mini-DVI and Micro-DVI are custom connectors made by Apple for their laptops, not any standard development.
*12 - Pre-SUV era that is.