Hacker News | eikenberry's comments

This also assumes that IQ testing has remained static. It has not. IQ tests continue to evolve, there is more than one of them, and they do not all agree. I.e., the tests themselves might be responsible for some of the variance.

AMD. The final holdout was HDMI 2.1 support, which had been blocked by the HDMI group. That has been overcome, with the HDMI group relenting, and support is now landing in the kernel (expected in 7.2).

https://www.gamingonlinux.com/2026/05/further-expanded-amd-h...


I sort of figured that the HDMI stupidity was strategically a good thing, as it brought the dynamic between the HDMI consortium and VESA, specifically how they treat end users, more into the public eye.

That is, more people being subtly pushed to using DisplayPort is not a bad thing.


I was faintly surprised that my recent monitor purchase came with a displayport cable.

Didn't help connecting it to my MacBook, but still..


Don't most monitors ship with DisplayPort cables? All of mine have. HDMI is more popular with TVs/home theater systems.

DisplayPort has been driving the best high-end PC monitors for a long while. HDMI, OTOH, has been in A/V land (DRM management).

I didn't follow this story much: how exactly did they get past the legal hurdles? Or there never actually were any hurdles, just sabre rattling?

Purely rumor, but supposedly Valve put tons of pressure on them (no idea by what means, again this is all rumor) because they wanted support for the Steam Machine release.

any reason why we are using hdmi over display port?

Unless you're on the absolute newest stuff with DisplayPort 2.1, HDMI 2.1 has more bandwidth than DP 1.4. That covers Nvidia's 2000 through 4000 series; no DisplayPort 2.1 until the RTX 5000s.

And monitors released during this time generally do the same.

Also, if you want to use it through a capture card, HDMI ones are way more common and cheaper.
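The bandwidth gap above can be sanity-checked with back-of-envelope math. A rough sketch (it ignores blanking intervals, and DSC compression changes the picture entirely; the payload figures are the commonly cited post-line-coding rates):

```python
# Why HDMI 2.1 out-runs DisplayPort 1.4: uncompressed pixel data rate
# vs. each link's usable payload after line coding.

def video_bandwidth_gbps(width, height, hz, bits_per_pixel):
    """Uncompressed pixel data rate in Gbit/s (no blanking overhead)."""
    return width * height * hz * bits_per_pixel / 1e9

DP_1_4_HBR3 = 25.92    # Gbit/s payload (32.4 raw over 4 lanes, 8b/10b)
HDMI_2_1_FRL = 42.67   # Gbit/s payload (48 raw over 4 lanes, 16b/18b)

need = video_bandwidth_gbps(3840, 2160, 120, 30)  # 4K 120 Hz, 10-bit RGB
print(f"4K120 10-bit needs ~{need:.1f} Gbit/s")
print("fits DP 1.4 without DSC:", need <= DP_1_4_HBR3)   # False
print("fits HDMI 2.1 FRL:", need <= HDMI_2_1_FRL)        # True
```

So a 4K 120 Hz 10-bit signal (~29.9 Gbit/s) overflows DP 1.4 without compression but fits comfortably within HDMI 2.1's FRL payload.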


AMD Radeon 7000 and 9000 series all support DisplayPort 2.1

The vast majority of TVs only come with HDMI. They don't even have decent analog inputs anymore.

I have been told (but not confirmed) that this is mandated by the HDMI mob. If you want HDMI on your TV, it cannot also have DP.

This can only be true for consumer-grade stuff. Even then I just guess the manufacturers kind of cheap out.

I have a dumb-ish Samsung Hotel TV / commercial TV at home. It has DP.


I want a TV with DP. Do you have a recommended source for where to pick up commercial TVs?

Which is kind of funny. At least, to my mind this has associated HDMI-only with the budget option (TVs), and DP with the premium tier (monitors).

What really drives me nuts is smart TVs with 100 Mbps Ethernet ports. When I bought a TV we looked in vain for gigabit Ethernet.

It is futile to expect the TV to be smart and support all sorts of apps and hardware only for it to be abandoned by the manufacturer years down the line. The only correct way to buy a TV, IMHO, is to hunt for a dumb one with excellent display properties and get a streaming device such as a Google TV Streamer, an Apple TV, or a DIY x86 HTPC.

>DIY x86 HTPC

ARM slander was not warranted


Are there DIY Arm boards that make a good HTPC? Do they have hardware video decoding?

Unfortunately we're the weird ones for wanting to stream >100 Mbps content.

My 2020 LG CX has a USB 2.0 port and I get ~300 Mbps with a gigabit adapter; if the TV you ended up with has a USB port, it's worth a try.
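A quick sketch of why the TV's built-in port falls short while the USB 2.0 workaround holds up (approximate figures, protocol overhead ignored; the 128 Mbps number is UHD Blu-ray's maximum video bitrate, as an upper bound for high-bitrate local streaming):

```python
# Smart-TV link speeds vs. high-bitrate video, in Mbit/s.
uhd_bluray_peak_mbps = 128   # UHD Blu-ray max video bitrate (worst case)
links = {
    "100M Ethernet (typical smart-TV port)": 100,
    "USB 2.0 gigabit adapter (as measured above)": 300,
}
for name, mbps in links.items():
    verdict = "handles" if mbps > uhd_bluray_peak_mbps else "chokes on"
    print(f"{name}: {verdict} a UHD remux")
```

The built-in 100 Mbps port can't keep up with worst-case UHD bitrates, but even USB 2.0's real-world ~300 Mbps has headroom.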


TVs are made with a BOM of like $10 for the SoC, so it's the cheapest crap available.

Then again, none of the streaming services stream at anything remotely close to 100 Mbps, so I doubt they consider it necessary to upgrade to GbE.


Some people have TVs or displays that only use HDMI. I personally wouldn't recommend HDMI if DisplayPort is available, but if HDMI is your only option, then having it work properly will be important.

My monitor has 1 displayport and 2 hdmi and I have 2 computers I use with it. They can't share the displayport. All comparable monitors (last time I checked) have the same. So it'd be nice if both worked.

For one, DisplayPort doesn’t support HDR output

That can't be right. I'm reading this comment on an HDR monitor over DP right now.

Don't all USB-C video outputs use DP alt mode too, with an HDMI adapter at the end? And they can do HDR.


The cable length limitations are also a pain in the ass for not-uncommon A/V system configurations. 6' recommended max, and the best you might get working stably if the device and cable gods smile on you is 15'. 6' is the lower edge of acceptable for just about any A/V system setup (in practice it means your devices need to be within about a meter of the screen's port[s], which is pretty close) and even 15' is still too short to be useful for, say, a projector, or a "the A/V receiver or HDMI switch is over in that cabinet, the TV is on this wall across the room" situation.

HDMI goes 25'+, no problem.


For 4k at 60Hz, you'd need HDMI 2.0 or DP 1.2. At those speeds, both kinds of cable should be able to reach 25 feet, and I can find reputable brands selling both kinds at the length.

> HDMI goes 25'+, no problem.

Yep. That's likely because that's an active cable. Active DisplayPort cables exist, too. Here is one vendor selling active UHBR10 cables [0]. If you don't NEED UHBR, then you'll find your selection to be much, much larger. I've been using some Monoprice-branded 50 and 100 ft active fiber-optic HBR3 DisplayPort cables for years with no problem.

[0] <https://www.bhphotovideo.com/c/products/displayport-cables/c...>


displayport has supported HDR10 since 2016

and displayport 2.0, since 2019, has supported all the same variations (hdr10+, dolby vision) that HDMI does


Do you mean in practice, or something? DP definitely supports HDR, and it seems to work fine for me.

This seems wrong to me? I use it to do so every day.

If true, not supporting HDR is a feature

Everyone knows this is true of every large enterprise shop, and it is one of the reasons that Wall Street rewards layoffs.

If Wall Street were so wise, they would only reward meaningful layoffs. Laying off 10% of a company by stack ranking every team accomplishes nothing, particularly if the company just hires the same number of people back next quarter.

If a tree has a dead branch, you cut it off. Cutting off 10% of the leaves evenly distributed among branches will remove some dead leaves, but it leaves the source of the problems unaddressed.


The only time layoffs work is when they go with cutting the product worked on completely from the company. Everything else should be managed by not growing too big when times are good, and by not hiring when someone leaves when times are bad. There will be ups and downs in your market; figure out what they are and ensure your headcount matches that long term, riding out the bad times with no profit, knowing they will get better again, and cutting all other costs.

But then you are asking hard questions. Which leaves? Which branch? How do I create a protective legal theory to prevent numerous lawsuits?

Best just to do it randomly to avoid hard decisions and lawsuits.


Most short term changes with stocks like this are wholly irrational and have very little to do with accomplishing anything other than signalling.

I think it will eventually be its own dialect of English. Telling LLMs what to do works better in not-quite-normal English, and I think this will continue until it isn't recognizable as natural English anymore, but is instead a new fuzzy programming language (probably >1).

I believe new (programming) languages will emerge both for LLMs to parse and take instructions from as well as for them to generate code in. The former is because English is a nuanced language evolved for human usage which the LLMs don't quite need, with the only advantage of it being a metric ton of training material. Same goes for Rust, Go and other languages LLMs do primarily well coding in, which all have concepts geared towards human convenience.

>Telling LLMs what to do is better using not quite normal English

What are your prompts like?


Did you specifically re-enable JavaScript? uBlock Origin in medium mode blocks all the tracking JavaScript, and I'd think advanced mode would follow the same basic starting point.

Yeah, didn't work without it.

I think the "They" mentioned was Docker, not Podman. That Docker was adopting the containerd standard.

Yup, that's definitely it; not sure how I got that wrong.

It's not just you. I interpreted it similarly

Cheating is a social issue, not a technical one. Communities are the solution.

Private servers are a nice way to do this and do still exist in places. My favorite online game uses them along with server-side anti-cheat, and while cheating occasionally happens, it has never been an ongoing issue. I've maybe seen a cheater once or twice in all my many hours playing the game over 10 years (Elite Dangerous, in case you were curious).


They are bookmarks. The more people who bookmark something the more attention it has. That attention is what you care about and is why that has become a metric people use (that and the fact that there is little else).

>(that and the fact that there is little else).

https://en.wikipedia.org/wiki/Streetlight_effect


Open source can be a hobby, but it can also be a portfolio. So not a paid job, but a way to get paid jobs. Tech interviewing is so incredibly broken that you really need every option working for you and cannot necessarily afford to "just stop".

Open models running locally is the answer. Relying on proprietary, closed software always puts that company's priorities above your own when using their software. You have given up control.

While running them locally presently doesn't make sense economically, you don't need to run them locally to address this issue. There is a lot of competition in hosting open models and you have a variety of services to choose from. Run the open models now; reward that ecosystem instead of continuing to reward closed systems that dream of rent-seeking.


You don't need to run the model locally if you don't care about sharing your data. Personally I am happy to share data with Kimi or Deepseek if it means we get better OSS models. For private stuff though local is king

It'll be a while yet before open models that are good enough will be viable for local use. Heck, I've been trying to use Qwen 3.5 39B A3B on my system, which is modest but no slouch, and have only been able to get ~4.5 tok/s after optimization, and it really runs my system red (fans instantly go crazy). It's just not practical for serious work.

I've been using Qwen 3.5 and then 3.6 27b Q4 on Ollama with a single 7900 XTX with the codex cli, and I have been blown away by how genuinely useful it is. I've been able to ask it to do long, multi step problems, and it's able to do things that would have likely taken me days to iron out in a matter of hours, or even minutes sometimes.

I get about 30 tok/s, which is far from blazing, but given the capability it has it is absolutely viable for accelerating my work.
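As a ballpark for why a model of that size fits on a single 24 GB card, here's a rough sketch. The 0.55 bytes/param figure (approximating a Q4-style quantization) and the flat 2 GiB overhead for KV cache and runtime buffers are my assumptions, not measurements:

```python
# Back-of-envelope VRAM estimate for a Q4-quantized model run locally.
def q4_vram_gib(params_billion, bytes_per_param=0.55, overhead_gib=2.0):
    """Estimated VRAM in GiB: quantized weights plus a flat runtime overhead."""
    return params_billion * 1e9 * bytes_per_param / 2**30 + overhead_gib

need = q4_vram_gib(27)  # a 27B model at ~Q4
print(f"~{need:.0f} GiB needed")          # ~16 GiB
print("fits a 24 GiB card:", need <= 24)  # True
```

So a 27B Q4 model lands around 16 GiB, leaving real headroom on a 24 GB GPU, which lines up with it running comfortably on a 7900 XTX.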

