Over the years, we’ve seen a good number of interfaces used for computer monitors, TVs, LCD panels and other all-things-display purposes. We’ve lived through VGA and the large variety o…
The short answer is that HDMI was mainly developed by a consortium of stereo and television manufacturers, whereas DisplayPort was always firmly developed as a modern replacement for VGA.
Somehow, I trust the people in the computer industry to make better and stricter standards than I expect from the audio/visual industry. There’s a lot more advertising fluff from those groups, while PC stuff can generally be nailed down by checking benchmarks against each other. How would you even benchmark two different stereo systems? (If I’m wrong and there is a way to benchmark them, cool, please share!)
Anyway, yeah, HDMI was for “Home Theaters” and pushed by the industry that builds that kind of thing and DisplayPort is for computers, period.
I used to think DisplayPort was the future, about 10-13 years ago.
By now I feel it has come and gone.
HDMI 2.1+ is making its way in everywhere.
It’s a better plug.
It tends to support enough pixels/Hz for most people.
It’s more ubiquitous, being on TVs, laptops, and monitors.
Pretty sure the PC desktop segment will keep the port alive for a while, but right now it doesn’t seem like a very useful port apart from having a plug that claws itself in place and is often unnecessarily hard to unplug.
With Ultra High Speed HDMI (these names are ridiculous, seriously, look at the standard names) there are very few, if any, reasons to use DP, apart from compliant HDMI cables costing an arm and a leg.
To be honest I’m struggling a bit to understand why it’s not just all pushed through a CAT6/7 Ethernet cable at this point.
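Rough numbers hint at why raw video doesn’t just go over Ethernet: uncompressed 4K already exceeds the 10 Gbit/s that Cat6a/Cat7 links are commonly rated for with 10GBASE-T. A back-of-the-envelope sketch (nominal figures only, counting active pixels and ignoring blanking, so the real requirement is even higher):

```python
# Back-of-the-envelope check: does uncompressed video fit on a 10 Gbit/s
# Ethernet link (the most that Cat6a/Cat7 cabling is commonly rated for,
# via 10GBASE-T)? Active pixels only, ignoring blanking, so the real
# requirement is even higher.

def video_gbps(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

for name, (w, h, hz) in {
    "1080p60": (1920, 1080, 60),
    "4K60":    (3840, 2160, 60),
    "4K120":   (3840, 2160, 120),
}.items():
    print(f"{name:8s} needs ~{video_gbps(w, h, hz):5.1f} Gbit/s (10GBASE-T offers 10.0)")
```

So a plain Cat6/7 run at ordinary Ethernet rates only works if the signal gets compressed somewhere along the way; dedicated HDMI-over-Cat schemes like HDBaseT exist, but they aren’t regular Ethernet.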
DisplayPort is a better system than HDMI. It can even ride piggyback on USB-C, which means a display can power a laptop over the same cable that carries the video. DisplayPort also supports daisy chaining (although it’s not a common feature on monitors), so you could potentially have a single USB-C cable going to a laptop and then have multiple monitors connected without needing a dock or anything of that sort.
But (most of) that is the DisplayPort standard, not the plug.
DisplayPort over USB-C works mostly fine, except that it’s “fine”, not perfect. Daisy chaining tends to make it less fine.
It’s a better standard, but a worse plug. Important distinction.
That doesn’t matter in the long run though. Better doesn’t always win.
Just look at how USB won over FireWire. And FireWire could daisy chain too.
My iPhone 13 Pro syncs slower over USB than my second generation iPod did over FireWire.
While I obviously can’t blame that fully on USB, it’s an ironic observation, especially since my OG iPod would be 21 years old now, if it still worked.
Your iPhone 13 syncs slower over USB because Apple decided to stay on Lightning connectors, which use USB 2.0 on the other end. Although FireWire was faster back when it co-existed with USB, the USB standard surpassed it a long time ago with more power, faster speeds, and better physical connectors.
I know, and commented on it (just not explicitly).
The irony is still there though.
And for many years it was an actual limitation of the USB interface as well. Only with USB 3, which didn’t see widespread adoption until 2009-2010, did USB surpass FireWire 400 speeds. And let’s not forget there was FireWire 800 as well.
Shift the argument back to 2012 before Lightning and it still holds. Their point is that USB 2.0 is slower than FireWire was. FireWire had been dead for years by the time USB 3.0 came around, and USB 3.0 required bulky connectors that never really caught on with mobile devices. It wasn’t until USB 3.1 with the C-type connector came along in 2015 that mobile devices finally started seeing wired transfer speeds that could meet or exceed FireWire.
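To put some rough numbers on the FireWire-versus-USB history: the nominal rates below are the spec headline figures, while the “typical” throughput column is a ballpark assumption, since FireWire’s peer-to-peer, DMA-driven bus wasted less of its budget than USB 2.0’s host-polled bulk transfers.

```python
# Nominal link rates are the spec headline numbers; the "typical" sustained
# throughput figures are rough ballparks, not measurements.
links = [
    # name,             nominal Mbit/s,  rough typical MB/s
    ("FireWire 400",     400,             "30-40"),
    ("USB 2.0",          480,             "25-35"),
    ("FireWire 800",     800,             "60-80"),
    ("USB 3.0 (5 Gbps)", 5000,            "300-400"),
]

for name, mbps, typical in links:
    print(f"{name:16s} nominal {mbps:5d} Mbit/s   typical ~{typical} MB/s")
```

Which is why a Lightning iPhone, stuck at USB 2.0, really can sync slower than a FireWire iPod did two decades earlier.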
Say what now?
I don’t like the DisplayPort connector.
Apparently an unpopular opinion, but hey. It’s mine, and I’ll keep it.
So it’s not actually a worse connector, then? Gotcha.
My argument is that it is. But, hey, read it whatever way makes you feel better about your own opinion.
You’ve not even presented an argument; you’ve only made a statement that it’s a worse connector. What are you basing this so-called argument on?
My preference would be using DisplayPort, but with USB-C as the actual connector.
Thanks for the FireWire memories. I got the first Windows compatible iPod and bought a FireWire card just to use it.
Same here, actually.
Worse plug how? You can buy DP cables without the locking mechanism.
Example
Go to your local electronics store and see if that is true when you want a cable “today”.
There’s a difference between theory and practice.
Different countries and regions may have better markets for DP cables, but I can’t recall having had options other than length.
Why would you even want that? The locking mechanism is IMO one of the advantages DP has over HDMI; I had one too many instances of the HDMI plug getting loose in the socket and causing signal loss (granted, not a big issue for home theater, but definitely an issue for some people for PC usage).
YMMV, obviously.
Locking DP is a pain in the ass for me. HDMI disconnecting was never an issue either.
I just wish I had a choice, but in practice I don’t as selection of cables is poor.
I never screwed in VGA or DVI connectors either.
Personally I prefer the plug falling out instead of the connector and/or cable getting damaged if you pull on it.
My monitors support daisy chaining and I just need to connect my laptop via USB-C. It’s really so much better and makes cable management easier.
Glad to hear it works. I know it is technically possible, but haven’t seen it in the field yet.
I have to admit I have a Lenovo laptop and monitors. I haven’t tested it with other brands.
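If you want to check what the OS actually detects once a daisy chain is up, here’s a small Linux-only sketch that reads the kernel’s DRM sysfs entries. MST (daisy-chained) displays usually show up as extra DP connectors, though the exact naming varies by driver, so treat it as a rough diagnostic rather than anything official.

```python
# Lists the display connectors the Linux kernel currently exposes via DRM sysfs.
# Each connector directory (e.g. card0-DP-1, card0-HDMI-A-1) has a "status"
# file containing "connected" or "disconnected". Daisy-chained (MST) monitors
# typically appear as additional DP connectors.
from pathlib import Path

for conn in sorted(Path("/sys/class/drm").glob("card*-*")):
    status_file = conn / "status"
    if status_file.exists():
        print(f"{conn.name:22s} {status_file.read_text().strip()}")
```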
Because of some HDMI licensing bullshit, HDMI is limited to 4k60 when using open source graphics drivers.
Really? First I’m hearing of it. Are you sure it’s not just that you don’t have an HDMI 2.1 cable?
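If the limitation in the comment above is about open-source drivers not being allowed to implement HDMI 2.1’s FRL signalling, the arithmetic roughly lines up: 4K60 at 8-bit just squeezes into the older TMDS link that HDMI 2.0 tops out at, while anything past that needs FRL. A rough sanity check, using the usual CTA-861 total timings (assumed here, not exact figures):

```python
# Rough sanity check: HDMI 2.0's TMDS link is 18 Gbit/s, of which ~14.4 Gbit/s
# carries video after 8b/10b encoding; HDMI 2.1's FRL link advertises 48 Gbit/s.
# Total timings (including blanking) are the usual CTA-861 ones, assumed here.
MODES = {
    # name: (total_h, total_v, refresh_hz, bits_per_pixel)
    "4K60  8-bit": (4400, 2250, 60, 24),
    "4K60 10-bit": (4400, 2250, 60, 30),
    "4K120 8-bit": (4400, 2250, 120, 24),
}
TMDS_VIDEO_GBPS = 18 * 8 / 10   # effective video rate on HDMI 2.0

for name, (th, tv, hz, bpp) in MODES.items():
    gbps = th * tv * hz * bpp / 1e9
    verdict = "fits TMDS (HDMI 2.0)" if gbps <= TMDS_VIDEO_GBPS else "needs FRL (HDMI 2.1)"
    print(f"{name}: ~{gbps:4.1f} Gbit/s -> {verdict}")
```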
I guess I’m the only one who LIKES the latching plug.
Like that latching plug, dislike how manufacturers put it on only one side.
That’s the only real solution IMO, because monitor makers are gonna do what they do. Terrible frames, oddly angled, recessed. Once it’s in? Latching is awesome. Ever need to remove it? Fuck it, buy a new monitor.
Does it actually do anything except make sure you rip the head off of your cable if it gets pulled too hard?
Just FYI, you can get DP cables without the retention clips. I too find them unnecessary and annoying.
You can, but they’re hard to come by where I live at least. I have two, but they’re Mini DP to DP. So haven’t gotten to use them since I was running my 970 ¯\_(ツ)_/¯
We also have 10-30 cables of each of the usual lengths in the IT supply room at the office, and they all have the same mechanism, no matter what manufacturer was chosen during purchasing.
And then, once in a while, you come across a screen where the port is rotated 180 degrees, so the release tab ends up between the plug and the back of the screen and you basically need child labor to unplug it properly.
That ought to be a crime! I would be getting fed up with that pretty quickly and making some improvised modifications to disable the retention clips!
In this situation I typically pull out my Leatherman’s knife to depress the clips, but it’s definitely not easy
Really? It’s been years since I’ve seen a display port connector with the latches on them.
Just because it’s more ubiquitous doesn’t make it a better plug.
DP 2.1 is technically superior to HDMI in many ways, especially since it’s the protocol that runs through USB C and supports daisy chaining, unlike HDMI which has to be converted to and from DP to be passed through USB C.
I don’t see your argument that “it’s a better plug” either; the DisplayPort connector is available as a locking or non-locking connector, is keyed better than HDMI, and in my experience is more solid.
The only reason HDMI is as common as it is comes down to a consortium putting money into keeping it popular, because they can charge manufacturers license fees. DP is free to use and developed by VESA; the standard can be downloaded directly from their website and added to whatever device you’re developing, with no licensing required. HDMI charges licensing fees PER PORT, which is why you often see GPUs with one HDMI and three DPs.
TL;DR: HDMI is only more common because a group has a financial interest in keeping it popular, despite it playing catch-up with DP.
Never said DP wasn’t a better standard. Or that HDMI was better because of being ubiquitous.
It’s certainly not keyed better than HDMI; that’s just ridiculous. If you’ve ever had a laptop with DisplayPort, or seen people interact with one, you’d notice that it’s too similar to USB, eSATA (which has become mostly irrelevant, if not entirely), and HDMI.
On top of that it’s difficult to feel if it’s the wrong way round. Easy to see, difficult to feel.
In practice, DisplayPort is not available as a non-locking connector, and the keying plus the locking connector make it worse, not better.
While, sure, someone makes money off of HDMI, that in itself isn’t an argument for why it’s popular.
I never disagreed with the facts in your comment, but I think you’re using poor arguments to make an invalid point.
I’m not arguing for HDMI either, btw. I’m just attempting to predict the future.
For speakers and headphones you can measure frequency response. But this is less fun than having a person listen and describe the sound with weird adjectives.
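For anyone curious what “measuring frequency response” actually involves, here’s a minimal sketch: play a swept sine through the speaker or headphone, record it with a measurement mic, and estimate the transfer function from input to output. The low-pass filter below just stands in for a real recording, so the specific numbers are made up.

```python
# Minimal frequency-response measurement sketch. In a real measurement the
# "recorded" signal comes from a measurement microphone; here a low-pass
# filter fakes a device that rolls off the top end.
import numpy as np
from scipy import signal

fs = 48_000                                    # sample rate in Hz
t = np.arange(0, 5.0, 1 / fs)                  # 5-second sweep
sweep = signal.chirp(t, f0=20, t1=t[-1], f1=20_000, method="logarithmic")

b, a = signal.butter(2, 12_000, btype="low", fs=fs)
recorded = signal.lfilter(b, a, sweep)         # stand-in for the mic capture

# Transfer-function estimate H(f) = Pxy / Pxx, printed at a few spot frequencies.
f, Pxy = signal.csd(sweep, recorded, fs=fs, nperseg=8192)
_, Pxx = signal.welch(sweep, fs=fs, nperseg=8192)
H = Pxy / Pxx
for freq in (100, 1_000, 10_000, 15_000):
    idx = np.argmin(np.abs(f - freq))
    print(f"{freq:>6} Hz: {20 * np.log10(np.abs(H[idx])):6.1f} dB")
```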
Prepare thine cochlear senses, oh noble audiophiles, for I’ve stumbled upon a sonorous marvel that’ll make your eardrums jitterbug like caffeinated squirrels at a techno rave. Upon placing these auditory gems upon your cranium, it’s as if you’re spelunking through the caverns of sound, where the bass is so profound that it feels like a cosmic beluga whale serenading a black hole.
this one sounds like silver monster cable.
this other one sounds like a straightened clothes hanger.
(they both sound the same)
Their feature sets reflect this well. It’s hard to declare one better than the other, because that depends entirely on the application. Some people think they would like a DisplayPort-based home theater setup, but they don’t realize how many features HDMI has that they unknowingly rely on, like auto lip-sync, eARC, CEC, etc.
Right but couldn’t TVs at least give me a choice? I’d love like a single eARC HDMI port with like 3 display ports and the ability to pass through USB data… Can you imagine how bitching that TV would be?
I’m not sure what I would do with three DisplayPort ports and only one HDMI; it would break my current setup. If you have a modern console, you want at least two HDMI ports (one for the console, the other for eARC), since VRR turns into a crapshoot when you put a receiver in between your console and your display.
Right now I basically have all my high bandwidth devices plugged straight into my TV and my low bandwidth devices plugged into my AVR. eARC, CEC, and auto lipsync all make sure this works relatively seamlessly, and I don’t have to worry about the fact that my AVR doesn’t support all the HDMI 2.1 bells and whistles.
Adding a DisplayPort port to my TV would be nice, since it would allow for higher bandwidth signals on GPUs that lack HDMI 2.1 ports.
True, currently it would be a bit of a mess, but can you imagine if consoles had a DisplayPort as well, and you could pair a controller or keyboard or whatever directly to your TV and have it pass the data through to either a console or PC connected to it, without having to change anything, basically using the TV as a USB/Bluetooth adapter?
Really, I guess I just want a higher-end PC monitor as a TV, but universal remotes could really be universal with a setup like that.
But it ends up being a huge overhaul of everything we currently have, with no easy single-cable swap solution for lots of devices. And that means lots of people hating it, which is why we stay with HDMI, for that and other reasons.
The video industry is perfectly capable of good standards. SDI, for example, was invented in 1989 and it’s still the best way to transmit video today. DisplayPort has advantages, but it’s worse than SDI in most ways.
I’ma need you to expand on that one a bit, because from what my amateurish search brought up, SDI is pretty much garbage spec-wise compared to DP.