Hacker News | vehemenz's comments

Can you provide some support for your moral position? You’ve also put “insane horror” in scare quotes, which honestly I find troubling.

Does your moral account provide some justificatory, non-antisemitic framework based on colonialism or oppression that allows us to sidestep the issues with Gazans’ support of Jihad, other extremist doctrines, and the extermination of Jews?

It’s kind of a rhetorical question, but it’s the least I would expect from someone arguing credibly about the morality of the conflict.


They're not "scare quotes", they're quotes.

Huh? In English orthography, quotes can serve multiple purposes.

Most native English speakers wouldn’t see the parent’s use of quotes (quotation marks) as mere mention.


This isn't the least bit confusing, man. The author confirms the purpose of the quotes, which is effortlessly understandable by native and non-native speakers alike. They were used to quote and comment on a direct phrase from the parent comment.

You absolutely do not need to double down on whatever it is you are doing here lol. Say you were wrong and move on.


Yeah, they're just quotes. It's a quote.

>Can you provide some support for your moral position?

Yes, of course. A bunch of people from Europe decided to move to Palestine and start a religious ethnostate there. In doing so they expelled and murdered lots of local residents.

The surviving locals are understandably less than happy about this, and have continued to fight to defend their lands to this day.

Since then, those people have caused far more harm to non-Jewish Palestinians than non-Jewish Palestinians have caused to them.

>allows us to sidestep the issues with Gazans’ support of Jihad, other extremist doctrines, and the extermination of Jews?

It's perfectly natural that Gazans would support the extermination of Jews. In the extreme environment that Israeli Jews force Palestinians to live in, it's fundamentally ridiculous to even describe it as an extremist position.

In a comfortable European context it's certainly extreme, but that's a fundamentally dishonest way of portraying it.


"It's perfectly natural that Gazans would support the extermination of Jews."

That's not a point about colonialism or occupation; that's a justification for ethnic extermination based on the conditions of the people holding the position. By that logic there is no floor: any atrocity becomes "perfectly natural" if the grievance is large enough.

In your broader argument you're describing a blood debt with no statute of limitations and no mechanism for resolution. Skåne (where I live) was Danish. Alsace was German. Most of Europe was Roman. At some point borders exist, people live within them, and the only available direction is forward. You haven't described a political framework; you've described a permanent state of war with no exit condition except one side's disappearance.

You just said it's perfectly natural to want to exterminate an ethnic group. Read that back.


>By that logic there is no floor: any atrocity becomes "perfectly natural" if the grievance is large enough.

This is mostly true, yeah. Do you not believe that humans act like that?

>In your broader argument you're describing a blood debt with no statute of limitations and no mechanism for resolution

Nonsense.

>Skåne (where I live) was Danish. Alsace was German. Most of Europe was Roman. At some point borders exist, people live within them, and the only available direction is forward.

Except Israel does not want Palestine to move forward.

There are approximately zero living people who give a shit about the things you mentioned; there are millions of living Palestinians who do care, and who suffer at the hands of the Israeli state every single day.

How did you intend for this comparison to be even vaguely relevant?

>you've described a permanent state of war with no exit condition except one side's disappearance.

This is deliberately obtuse; Israel has had plenty of ways to largely exit this conflict. At the very least they could've given all Palestinians Israeli citizenship and equal rights decades ago.

Of course, that's not really compatible with the ideals of the Jewish ethnostate. I'm sure the Palestinians wouldn't seriously object, though.

>You just said it's perfectly natural to want to exterminate an ethnic group. Read that back.

I'll repeat it if you want me to. We've seen it over and over again in history, it's hardly a new thing.

Considering how the Jewish people choose to treat the Palestinians, it is not surprising that Palestinians want to exterminate the Jewish people. It is a perfectly predictable reaction, and not some special quirk of the Palestinians.


Israel has been at the table at Oslo, Camp David, Taba and Annapolis. Palestinian leadership walked away from each without a counter-proposal. That's not a secret.

Since I already wrote a reply to your now-deleted comment:

>You've just made my point. The reason Skåne isn't contested is that there are no living people suffering under Danish occupation of it. You've described the mechanism yourself: time plus resolution. That's exactly what a two-state settlement would produce. You've argued for the process without noticing.

Israel has explicitly rejected that over and over again, and continues to do so every day through ongoing annexations.

There will never be moral high ground for the state of Israel as long as it allows the settlements to exist and doesn't at the very least honor its internationally recognized borders.

>On citizenship: Israel granting full citizenship to all Palestinians would mean the end of a Jewish majority state within a generation, by demographics alone. You know that. Proposing it as a "simple solution Israel refused" is not a good faith argument; it's describing the dissolution of Israel but painting it as moderation.

A Jewish ethnostate is as morally unacceptable as an Aryan ethnostate.

>On extermination: you've moved from "perfectly natural" to "perfectly predictable." Those aren't the same thing. Predictable means understandable given the circumstances. Natural means it requires no further justification. You've retreated and you haven't noticed.

No, it's both. It's predictable because it's a natural reaction.


Deleted it, yeah, I've decided it wasn't worth it. Still isn't, mostly.

This will be my last comment to you, I don't want to engage with someone so comfortable at defending genocide.

One final fact check for you: a state with 20% Arab citizens who vote, sit in the Knesset and serve on the Supreme Court is not comparable to a state founded on racial extermination. That comparison doesn't survive contact with the facts.

The settlements are illegal and indefensible. I said so already in this thread.

Everything else you've written today you can own.


>One final fact check for you: a state with 20% Arab citizens who vote, sit in the Knesset and serve on the Supreme Court is not comparable to a state founded on racial extermination. That comparison doesn't survive contact with the facts.

It is an indisputable historical fact that Jewish Zionists expelled vast numbers of Palestinians from their homes and forced them out of the territory of what is now modern Israel.

>This will be my last comment to you, I don't want to engage with someone so comfortable at defending genocide.

I'm not defending genocide, that's a ridiculous interpretation of my words. I'm just pointing out the fact that if you keep poking someone long or hard enough, you shouldn't be surprised when they eventually want to get rid of you.

I certainly don't think the extermination of Israeli jews would in any way be a positive outcome.


No one disagrees that the Gazans feel the way they do. But your position is a stronger one: you seem to be excusing or justifying the moral behavior of Gazans in a way that looks self-undermining.

It’s not morally credible to focus on the Jews’ actions alone, given the broader context of the conflict, including centuries of Islamic conquest and domination. I don’t want to be patronizing and give history lessons, but antisemitism, Jihadism, and other Islamist extremist doctrines predate the state of Israel by centuries.


So, are you saying that it's not justified for Gazans to feel the way they do? Why not?

> It’s not morally credible to focus on the Jews’ actions alone, given the broader context of the conflict, including centuries of Islamic conquest and domination. I don’t want to be patronizing and give history lessons, but antisemitism, Jihadism, and other Islamist extremist doctrines predate the state of Israel by centuries.

That's just whataboutism and has approximately nothing to do with the conflict started by the creation of the modern state of Israel.

The only people you could reasonably blame besides the Zionist Jews are the other Europeans.


I don't think there's that much of a distinction.

The real difference is that a "true professional" already has the software—purchased at full price by themselves or by their employer—and doesn't need a subscription in the first place.


The biggest distinction, in my experience, is that prosumers tend to be means-focused and professionals tend to be ends-focused, so there's less zealotry and evangelism in professional circles.


Also, in professional circles there are usually one or two industry standards and you just use what everyone else is using.


Many people that use professional tools are genuinely doing hobbyist stuff. Especially if they haven't already bought their tools outright.

But besides, this subscription works with Family Sharing and is only $12, so it looks easy to get your money's worth.


This is a weird one. I think their reasoning was that most people don't use Launchpad, so they integrated it into Spotlight to eliminate redundancy.

I much prefer the new app launcher in Tahoe, but it was created at the expense of Launchpad, which some people actually relied on. I don't know why they couldn't have kept both options.


And yet, sometimes the criticism is warranted, and sometimes it's not. That's why it's good not to overgeneralize about patterns.


I think that’s unnecessary waffling. Of course there are exceptions, but the prejudiced negative views that the old and the young hold of each other are generally wrong.

For example, the constantly recurring critique that the music of the young is not about musicality[1] is always wrong. It's as wrong today as it was about Elvis.

[1] https://news.ycombinator.com/item?id=45637667#45639674


Shortcuts.app and AppleScript work for this.


Tailwind is almost too simple to bother using an LLM for. There’s no reason to introduce high-level abstractions (your “real” CSS, I imagine) that make the code more complicated, unless you have some clever methodology.


Can you explain? Tailwind massively reduces overhead for abstraction, classing, documentation, and maintenance.


AFAICT, Tailwind is largely (not entirely) a different, shorter syntax for writing inline styles. (E.g., class="bg-white" ≈ style="background-color: white".)

If you've rejected structural CSS to begin with, I sort of get the point that it saves a lot of typing; otherwise I don't see how it helps all that much over SASS or just modern plain CSS.
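To make the "shorter syntax for inline styles" reading concrete, here's a toy TypeScript sketch. The class names are real Tailwind utilities, but the lookup table and the twToInline helper are hand-written for illustration; this is not Tailwind's actual implementation, which compiles utilities into stylesheet rules rather than inline styles.

```typescript
// Toy lookup table: a handful of Tailwind utility classes and the
// inline-style declarations they roughly stand in for.
const utilityToCss: Record<string, string> = {
  "bg-white": "background-color: #fff",
  "text-red-500": "color: #ef4444",
  "p-4": "padding: 1rem",
  "absolute": "position: absolute",
};

// Expand a Tailwind-style class list into an equivalent inline style string,
// silently skipping classes the toy table doesn't know about.
function twToInline(classes: string[]): string {
  return classes
    .map((c) => utilityToCss[c])
    .filter((decl): decl is string => decl !== undefined)
    .join("; ");
}

// <div class="bg-white p-4"> ≈ <div style="background-color: #fff; padding: 1rem">
console.log(twToInline(["bg-white", "p-4"]));
```

The one-to-one flavor of that mapping is the point of contention: whether naming the declaration ("bg-white") instead of writing it out buys you anything beyond brevity.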


Tailwind is a dirty hack. Normally you declare a class and apply it to items of the same concept; that's the reason CSS exists in the first place.

Front-end devs got lazy and started writing, for each element, position: absolute; left: 3px; top: 6px; color: red; ...

You could write <font color="red">Hello</font>; that would be of similar "cleanliness".


Can’t wait for Headwind CSS implemented as custom elements.


The top menu is not necessarily for common functions. Nor is it true that no one uses it.

Chrome’s unique buried menu breaks user expectations. Casual users have trouble finding it.


Do you have a cite for that? It's a hamburger menu, which is by far a more common menu paradigm in the modern world. Do you imagine some kind of "casual user" who... doesn't have a phone?

I repeat: the menu bar is a dying abstraction, preserved in a consistent form only on the Macintosh (even iOS has no equivalent), because its presence is unavoidable. Users of modern apps don't see it used consistently, so it's absolutely not surprising that Apple's designers aren't doing it either.


The author addresses this. Humans are the same in 2026 as in 1992.

Besides, that interface designers or even the average computer user understands more than in 1992 is highly implausible on its face.


Humans are definitely not the same as in 1992 when it comes to their everyday knowledge of computer interactions.

And even if human cognition itself were unchanged, our understanding of HCI has evolved significantly since then, well beyond what merely “feels right.”

Most UX researchers today can back up their claims with empirical data.

The article goes on at great length about consistency, yet it then insists that text transformations require special treatment, with the HIG example looking outright unreadable.

Menu text should remain stable and not mirror or preview what’s happening to the selected text IMHO.

Also, some redundancy is not necessarily a bad thing in UI design, and not all users, for various reasons, have a reading vocabulary that covers the full breadth of what a system provides.


> Most UX researchers today can back up their claims with empirical data.

HCI work in 1992 was very heavily based on user research, famously so at Apple. They definitely had the data.

I find myself questioning that today (like, have these horrible Tahoe icons really been tested properly?) although maybe unfairly, as I'm not an HCI expert. It does feel like there are more bad UIs around today, but that doesn't necessarily mean techniques have regressed. Computers just do a hell of a lot more stuff these days, so maybe it's just impossible to avoid additional complexity.

One thing that has definitely changed is the use of automated A/B testing -- is that the "empirical data" you're thinking of? I do wonder if that mostly provides short-term gains while gradually messing up the overall coherency of the UI.

Also, micro-optimizing via A/B testing can lead to frequent UI churn, which is something that I and many others find very annoying and confusing.
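For what it's worth, the "empirical data" behind a typical automated A/B test usually boils down to something like a two-proportion comparison. A minimal sketch: the abZScore name and the sample numbers below are made up for illustration, but the formula is the standard pooled two-proportion z-statistic.

```typescript
// Two-proportion z-statistic: does variant B's conversion rate differ
// from variant A's by more than chance would explain?
function abZScore(convA: number, nA: number, convB: number, nB: number): number {
  const pA = convA / nA;
  const pB = convB / nB;
  const pooled = (convA + convB) / (nA + nB);                       // pooled conversion rate
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / nA + 1 / nB));  // standard error
  return (pB - pA) / se;
}

// |z| > 1.96 ≈ significant at the 5% level (two-sided).
// 120/1000 vs 160/1000 conversions gives z ≈ 2.58 here.
const z = abZScore(120, 1000, 160, 1000);
console.log(z.toFixed(2), Math.abs(z) > 1.96 ? "significant" : "not significant");
```

Note that this kind of test only tells you which variant converts better on one metric; it says nothing about overall coherency, which is exactly the worry above.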


There wasn't any user testing as we know it today; it was mostly top-down application of principles.

To my knowledge, this was all expert-driven at the time.

Empirical validation didn't really take off until the late '00s.

https://hci.stanford.edu/publications/bds/4p-guidelines.html

Don Norman took an explicit expert-knowledge-first stance in 2006 and 2011. Nothing inherently wrong with that, but it's definitely not research-driven.

"Always be researching. Always be acting."

https://jnd.org/act-first-do-the-research-later/

Tognazzini and Norman criticized Apple about this a decade ago. While they have many good points, I cannot shake the feeling that they were simply used to brand Apple as user-friendly in the '90s, and that Apple never actually adopted their principles but just used them as they fit the company's marketing.

https://www.fastcompany.com/3053406/how-apple-is-giving-desi...

There are a bunch of discussions on this:

https://news.ycombinator.com/item?id=10559387 [2015] https://news.ycombinator.com/item?id=19887519 [2019]


That's interesting, I hadn't heard that point of view before.

> Empirical validation didn't really take off until the late '00s.

https://hci.stanford.edu/publications/bds/4p-guidelines.html

Hmmm, I don't quite see where that supports "Apple didn't do empirical validation"? Is it just that it doesn't mention empirical validation at all, instead focusing on designer-imposed UI consistency?

ISTR hearing a lot about how the Mac team did user research back in the 1980s, though I don't have a citation handy. Specific aspects like the one-button mouse and the menu bar at the top of the screen were derived by watching users try out different variations.

I take that to be "empirical validation", but maybe you have a different / stricter meaning in mind?

Admittedly the Apple designers tried to extract general principles from the user studies (like "UI elements should look and behave consistently across different contexts") and then imposed those as top-down design rules. But it's hard to see how you could realistically test those principles. What's the optimal level of consistency vs inconsistency across an entire OS? And is anyone actually testing that sort of thing today?

> I cannot shake the feeling that they were simply used to brand Apple as user-friendly in the '90s and that Apple never actually adopted their principles but just used them as they fit the company's marketing.

I personally think Apple did follow their own guidelines pretty closely in the 90s, but in the OS X era they've been gradually eroded. iOS 7 in particular was probably a big inflexion point -- I think that's when many formerly-crucial principles like borders around buttons were dropped.


Like the whole recoverability paradigm: it seems more like a feature from the developer's perspective, looking for a reason to exist, than a true user demand.

You already have state management for debugging purposes, so why not expose it to the user?

As an example, in Photoshop no non-professional users care about non-destructive workflows; these things have to be learned as a skill.

Undo is nice to have in most situations, but you can really only trust your own saves and version management with anything serious.

Something as simple as a clipboard history is still nowhere to be found as a built-in feature in macOS, yet it somehow made its way into Windows.


Why is it highly implausible on its face other than the fact it makes arguing against him harder?


Why would UX be getting worse across the board if there is greater understanding now?


Did you mean to reply to me?

The person I replied to said, "that interface designers or even the average computer user understands more than in 1992 is highly implausible on its face"

Think of computer users at the ages of 10, 20, 30, 40, 50, 60, 70, and 80 in 1992. For each group, estimate their computer knowledge when they sat down at a computer in 1992.

Now do the same exercise for the year 2026.

How is it highly implausible on its face that the average computer user in 2026 understands less than the average computer user in 1992?


> Did you mean to reply to me?

I think so.

> The person I replied to said, "that interface designers or even the average computer user understands more than in 1992 is highly implausible on its face"

Yes, I agree with this person.

>How is it highly implausible on its face that the average computer user in 2026 understands less than the average computer user in 1992?

I don't think it is. Particularly with the average user, the bar of understanding is lower now.


> Particularly with the average user, the bar of understanding is lower now.

Can you explain how this is true given that everyone using a computer today has had a lifetime of computer use whereas in 1992 many people were encountering computers for the first time?


Here are a few perfectly acceptable explanations.

1. Computer users were generally well-educated, unlike today.

2. UX designers didn’t inherit any mess and could operate from first principles.

3. The “experience” of modern users—phones, tablets, and software that does everything for you—doesn’t translate the way you think. And it explains why Gen Z seems to have regressed in terms of tech knowledge.


> Can you explain how this is true given that everyone using a computer today has had a lifetime of computer use whereas in 1992 many people were encountering computers for the first time?

The userbase has been watered down with a larger proportion of individuals who are not highly technical.


Oh that statement is so 1992. Millions of people getting a Dell or a Gateway and annoying their techie friend “So now what do I do with this?”

Or 1982.

Users are always non-technical.

