I Hate the Term "Modern" (paritybit.ca)
33 points by wild_egg on Aug 28, 2024 | hide | past | favorite | 70 comments


> The word modern has somehow become associated with “something better”…

Well, that's because modern typically is better.

Humans don't like problems, issues, inconveniences, accidents, or flaws. So many billions of us are constantly engaged in the enterprise of improving things. Or trying to, at least.

Of course, this requires trying new approaches to create "modern" things. Trying new approaches is the definition of experimentation. And experimentation produces failures, usually more often than successes. So many modern things are in fact worse than the old things. But some new things are actually good, and these good things tend to rise in popularity and last.

What's tricky is that even these modern+good things often seem bad at first because, well, they are kind of bad at first. They're too new to have had years of refinement and innovation. As the saying goes, when you invent the ship, you invent the shipwreck. But this doesn't mean you throw away ships. It means you keep iterating on ships to make them better than previous generations.

This is the process of modernization: Experimenting with a smorgasbord of new stuff, most of which sucks, but some of which is good, and yet still requires years of improvement to become great.

The good things from the past that the author is praising were themselves modern at some point. Today they're just further along in the process of being refined and improved.


This is a helpful analysis, but it’s also worth mentioning that new things can be created and adopted for reasons other than being in our long term benefit.


Humans do like to explore the possibility space, and there is a prevailing direction.


This is a wise perspective


My assumptions whenever I see a program that advertises itself as modern.

1. It will have more dependencies than lines of code. Opening a source file to see how it works will reveal nothing, because there's nothing there.

2. It will have at least one annoying feature that can't be turned off, like autosaving the file every 10ms, which will completely tank performance on a network filesystem or any other storage slightly slower than the developer's top end workstation.

3. It will have at least one hard coded safeguard to save you from yourself, like completely refusing to save files to /tmp, because then you might lose work.

You will run into limitation #3 while trying to work around problem #2, at which point you will discover development philosophy #1.


I agree.

And when a UI is described as "modern", you can be pretty sure that you're going to be cursing it if you actually use the thing.


and it optimizes for looking pretty in screenshots rather than for information density, which would risk looking cluttered


I'm cracking up reading this article because in Architecture (buildings not software) the word "modern" is basically used incorrectly as a rule. The "Modern" period started like 100 years ago, and yet it has somehow come to mean (tacky) high-contrast, open layout homes, only suitable for Mediterranean climates.

The author is totally right though, software people are subject to fads probably worse than the general public and second only to teenagers. This trendiness is rarely useful, since we are constantly re-inventing the wheel and more effective solutions tend to take longer to win out.


to be fair a lot of Modernist buildings are not particularly climate appropriate either. Most of the Frank Lloyd Wright buildings have major leaking issues, for example.


> software people are subject to fads probably worse than the general public and second only to teenagers

A-FRICKING-MEN. It's exhausting.


I always subconsciously interpret the word 'modern' as 'in fashion,' probably because the word 'mode' means fashion in my language. I don't associate it with something good or advanced, but rather with something vain and ephemeral.


"Modern" as in "not repeating old mistakes" is good. It does not guarantee against making new mistakes, of course.

"Wow, modern" as in "very new, shiny, and surrounded by hype" is usually bad, because the emotions overshadow a rational view. This leads to incorrect assumptions, excessive optimism, and often to enthusiastic but incorrect application of the new and somehow worthy thing.

Improvement is good; fashion-driven hype is bad.


So it's à la mode, that is to say, with ice cream.


TFA should be arguing for use of the phrase "du jour" instead.


"Modern" is 1920s-1960s.[1] We're now in post-post-modern, or something like that.

[1] https://www.youtube.com/watch?v=GOjglFG2XAc

[2] https://en.wikipedia.org/wiki/Modern_architecture


That's the "Modern" cultural movement.

The "Modern" Age instead ended at around the 17th or the 18th century.

Naming anything "modern" is pure hubris.


The fact that there's confusion on how to use the word "modern" is why I dislike it.

When people say they hate "modern" art they usually mean contemporary art.


or just modern, because modern/modernist architecture is different from contemporary


I was saying most people tend to use the term incorrectly, not that they're the same.


In European history, the early modern period is roughly 1500 to 1800, and "modern" without qualifiers usually refers to the period starting from ~1800.

In other fields, "modern" may have a specific meaning due to historical baggage, or it can simply refer to the present day. The latter is basically the literal meaning of the word.


Actually the modern period began in the 15th century; one common starting date is the fall of Constantinople to the Ottoman Turks in 1453.


I have a similar feeling towards "Legacy." Aka "Everyone who wrote it doesn't work here anymore and I can't be bothered to learn how it works"


Why are you using that heirloom framework instead of something post-modern?


I heard a story from the 1990s. Oracle was going around to encourage people to write their domain-specific components as a Data Cartridge. Their evangelist pissed off the owner of a local software company by casually referring to how customers port 'legacy' software to Oracle.


> For example, I have encountered situations where others would look at the tools that I am using, such as Neovim [...] and say things like “Why not use a modern editor like VS Code?” or “Why not use a modern web interface for your email?”.

VS Code will one day be the old legacy editor. Everyone on the modern train will be obliged to change their editor to the new modern one, multiple times. Some people reading this are probably thinking "but VS Code has so much development behind it that it will stay relevant"... this is exactly what users of "premodern" tools are saying now.


Funny that you should say that. Just met with a group of very senior engineers today and two of them said they weren't using VS Code any more. Instead they used:

1. https://www.trycursor.com/

2. https://zed.dev/

I'm still on VS Code myself. Cursor at least is just a fork of VS Code with AI features, so you can still use VS Code extensions; it's something I might try at some point.

Zed sounds cool (it's fast; the guy who used it said it made him feel like he was programming in the 90s again, and I know exactly what he means), but I love having all the extensions (a quick search finds no Tailwind extension for Zed, for instance, and I'm really loving the Tailwind autocomplete). Might give it a try at some point, but I doubt I'll change to it.

Thing is, I was one of those people who kept jumping to new editors. Every single time it was because the current editor had a serious problem of one flavor or another--and another editor solved that problem.

VS Code is likely to be Good Enough for a long time. Zed might gain some adherents like the guy I met today, but I didn't keep trying new editors because I was "on the modern train," but because every other editor sucked in one way or another. All the vims and emacs variants suck. This shouldn't even be a debate, to be honest, but they do suck.

If VS Code keeps on its current trajectory (i.e., Microsoft doesn't abandon it), it will likely be the Good Enough editor perpetually. It's already 9 years old and it's still the "modern editor" of choice. And when I suggest people try a modern editor, I don't even really mean VS Code but any editor that's modern. Sublime, Atom, Visual Studio, any of the JetBrains editors...anything modern is better than vim or emacs. But some folks trained their fingers and don't want to change. There's nothing you can say to convince them.

As another comment points out, Shakespeare is written in "modern English." It's not about chasing the "modern train" as much as not continuing to use Atom or Neovim when everyone else is using VS Code, and VS Code has become the common IDE of the work environment, is already set up to debug the app everyone is working on, and has all of the extensions everyone needs to be on the same page.


I’ve been using BBEdit since the 1980s.

It’s still being updated.


"Modern" has several wibbly wobbly definitions in various contexts, but with regards to languages it is actually quite specific:

"denoting the form of a language that is currently used, as opposed to any earlier form," e.g. "modern German".

Yes, Shakespeare is modern English. Deal with it.

With respect to computer applications, I'd support a similar use of the word. There are plenty of languages and applications that are old but still actively used by many people. That means they're modern even if, like Shakespeare, they are outdated in some respects.

It's arguable that computing has evolved so quickly that the line between what is modern and what is not might be set more rigidly, but what year would that be? Many people are still using Fortran because it happens to be really good for certain things. Meanwhile, there are languages and tools that are much more recent that nobody is using because they were completely superseded by something better. Can something made just a decade ago not be modern, just because it's not currently in use?


I hate the word "technology," as in, "Our new framistan is made with foobar technology!"


A while back, I developed a mental habit of replacing the word "technology" with "thing" in my mind.


For a year or so, during the peak of cloud computing hype, I had a little browser extension installed which provided a great deal of puerile amusement by replacing every instance of "cloud" with the word "butt".


A phrase I hate even more than “modern” is “x is the future”, especially when used to describe a technology that only works for certain use cases.

It’s an attempt to use social pressure and bullying tactics to enforce a technological consensus. I particularly dislike it when it’s used to dismiss legitimate concerns with missing features in new technology instead of acknowledging room for growth and development and the real diversity in user needs.


"x is the future" is a sales pitch, nothing more, and can therefore be safely and completely ignored.


I see it used more often by hobbyists interested in crowding out alternatives they don’t like. “Why are you using snap? Flatpak is the future!” “What, you’re still using Xorg in nvidia? But Wayland is the future!

And I like flatpak and Wayland! But the contexts I’m discussing aren’t “We’re creating the future of x at Company Y”… of course that’s a sales pitch that can be ignored. But there is a context where it’s more like bullying.


> I see it used more often by hobbyists interested in crowding out alternatives they don’t like.

Yes, that's what I was referring to.


--can therefore be safely and completely ignored, until a large number of people believe it.


The chances of the statement being complete bullshit nears certainty in most contexts today, so ignoring it as an argument would be a great course of action as such. Unfortunately the person-like thing making the statement, or its backend organization, often has leverage to make it a pain continuing to use the existing, functional, sane, tasteful solution/tool/feature/model/etc. For that reason the threat must often be taken seriously as an attack on the existing order.


I thought x was the past and wayland the future!


It may or may not bother people less|more [1], but "post-modern" gets some github love: https://github.com/search?q=%2Fpost-modern&type=repositories... including a command I use hundreds of times per day written in Nim: https://github.com/c-blake/lc

But it looks like someone just flagged this article or some such as it went from rank 16 to 116 in like 1 minute. So, oh well.

[1] https://en.wikipedia.org/wiki/Postmodernism


Postmodern to me has strong negative connotations, unless it is being used ironically.


I'm not sure if it counts as "ironic", but personally I almost always do mean it "jokingly"/non-"seriously".

If you read that wiki link ref'd above, it seems few claim to know what it's "supposed to mean" anyway.. So, it's hard to be actually sarcastic with any such adjective.

There are like 600 projects on that github search page with which one might try to assess developer feelings more broadly, I suppose.


The two examples that make my blood boil are:

1. “Modern web app”, also known as a polyglot monstrosity with ten thousand transitive dependencies of which at least a few dozen were written by kids from Eastern Europe now pushing up sunflowers somewhere along the Donbas front.[1]

2. “Modern auth” also known as designed-by-committee standards documents that no human being can fully comprehend because it is almost entirely written with just six words, most of which are synonyms in the English language.[2]

[1] This was the specific reason that forced an enterprise customer of mine to rewrite most of a newly developed Angular app.

[2] Paraphrasing only slightly: “Token code cookie authentication ID code token code.”


> Really what I’m focusing on here is the second definition and its misuse as a term to justify derision toward programming practices, languages, and software solely based on the age or appearance of said practice, language or piece of software.

Get off my lawn!

I have a technique that is a hideous chimera of very "modern" stuff, like async/await, extensions, prototypes, first-class functions, etc. (Spoiler: They aren't actually that "modern"), and very old techniques, like Structured Programming, backwards assignment, and hand-crafted assertions.

WFM. YMMV.


This blog post is like 5-7 years late. For me, the term "modern" when used by software developers means "avoid". It is useful for filtering out garbage.


It makes sense in enough contexts that I like it. The context-specific meaning is a moving target, though. For instance in 2005 a modern web framework would give you URLs without file extensions. In 2008 a modern JavaScript framework wouldn't modify prototypes of built-in objects. In 2022 a modern node.js library would support ECMAScript modules. Just because the context-specific meaning changes doesn't mean the overall meaning changes.
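To make the last example concrete, here is a sketch of what "supporting ECMAScript modules" meant for a 2022 Node.js library. The package name and file paths are hypothetical; the `exports` map is the standard Node.js mechanism for exposing both entry points:

```javascript
// "Modern" (2022) Node.js libraries ship ES modules alongside CommonJS.
// Consumers then write:  import { greet } from "example-lib";      // ESM
// instead of the older:  const { greet } = require("example-lib"); // CJS
// The library advertises both entry points via "exports" in package.json.
// "example-lib" and the dist/ paths below are invented for illustration.
const pkg = {
  name: "example-lib",
  type: "module", // .js files in this package default to ESM
  exports: {
    ".": {
      import: "./dist/index.mjs",  // resolved for `import`
      require: "./dist/index.cjs", // resolved for `require`
    },
  },
};
console.log(Object.keys(pkg.exports["."]).join(","));
```

A library that only ships the `require` half is exactly the kind of thing the 2022 sense of "modern" was defined against.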


Agree. I hate it because it implies nothing except recency, which is fleeting.

If I want to learn "modern programming", should I buy this book? https://www.abebooks.com/servlet/BookDetailsPL?bi=368626334

(it's from 1987)

"modern c++" has represented wildly different approaches to programming over time. Compare 2003's "modern c++" to now.


I just think it's overused and lazy. Describing your project as "modern" doesn't tell me anything useful. What's modern about it? What's the "current style" that it adheres to? Is it lightweight? Modular? Distributed? Functional? Typesafe? Test-driven? Stateless?

Whichever it is, just use that word instead! Tell me something specific about what makes it worth considering over existing "old fashioned" options.


I appreciate the term 'modern', it's basically saying newfangled for better or worse depending on how you view recent styles. Simply don't equate modern with better, or only better in certain ways that you may or may not agree with. For others who insist that modern is better, don't listen to them; don't try to correct the internet.


The related annoyance is to call things “legacy” when you’re introducing a new thing. I remember a Microsoft meeting where someone was introducing yet another database API or some such “innovation” and kept referring to “legacy systems”. One of the other architects dryly said “just to be clear, by legacy system you mean every existing system?”


It’s just pragmatic to brand your work as modern. “Contemporary” is harder to type and too complex.


In the context of software marketing, "modern" is a synonym for "smart" and "fast". They all mean "better but actually worse".


Contemporary is a better term. Less loaded.


"Universal" is also a term that doesn't typically engender happiness in technologists.


Universal has been part of the definition of modernism. One can always critique a modern effort on the scale of universality.

Modernism grew out of post-World War I optimism for a universal humanity.


Modern programming language like Rust


What I find offensive about this post is using "neo"vim and neomutt. what, was original AgentSmithMutt and TrinityVim not good enough? ...Is it because they're old?

Well anyway, I mostly agree with this post. Another similar thing I hate to see is things like "Really? In 2024?" and I'm just like, that's the laziest critique of something ever.


neo means new


so does modern. so the same author who hates the term modern uses neo - synonymous with modern for all practical purposes - vim and mutt. why is that? might those reasons also apply to other "modern" things?


If you're forking software to introduce change… it is factually newer.


Yes, call it current instead. It's not going to be modern for long.


Title needs 2020 :-)


> Just because something is recent, newer, or shinier does not make it automatically better.

Sure, but that's just your strawman. It's better because learning from the mistakes of the past leads to the avoidance of some of them, which happens even in computing. While you conceptually can easily make new mistakes, it's a tough sell to reject progress altogether by decree.

Otherwise hard to argue more specifically since there is nothing more specific in the post when extolling the virtues of the bad old ways:

> In fact, in this particular case I’d argue that the Git email workflow is far superior to the pull request workflow for a multitude of reasons, as I have so often said in the past.

A link to that blog would have been nice


I used to criticize every hacker news post's use of the term 'modern' and was downvoted every time.


I couldn't disagree more.

Paradigms in computing change. Modern means something uses paradigms that are best for today.

E.g. modern means using a JSON API rather than XML, or vanilla JavaScript rather than jQuery, or a responsive web design rather than separate fixed designs for desktop and mobile.

Modern, in this context, is meaningful and important to be aware of.

Of course, just because something is new doesn't make it better. But when I see "modern" in the description of a project, it usually does seem to refer to the paradigms that are the most useful for today. It's not just some kind of fetish for whatever's new, which is what the author seems to believe.
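As a toy illustration of the first paradigm shift mentioned above (the record and field values are invented for the example), the same data in the older XML-API style and the JSON style that replaced it:

```javascript
// The same record in the older XML-API style and the "modern" JSON style.
// JSON parses natively in JavaScript; XML needs a separate parser library.
const asXml = "<user><id>42</id><name>Ada</name></user>";
const asJson = '{"id": 42, "name": "Ada"}';

const user = JSON.parse(asJson); // one built-in call, no parser dependency
console.log(user.id, user.name); // → 42 Ada
```

That built-in `JSON.parse` call, versus wiring up an XML parser and walking a DOM, is a fair summary of why the JSON paradigm won.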


Modern in the design sense should refer to the current thinking of the state of the art at inception. The individual gets to choose whether or not something fits their modern sensibilities.

Timeless design taps into sensibilities that transcend popular trends


I'm having a hard time understanding what you've written. The design sense as opposed to what? At the inception of what? What are "modern sensibilities" of an individual as opposed to a culture?

In any case, there's no such thing as "timeless design" in computing, as it's evolving far too fast. Does timeless design mean interfaces that rely on keyboards but not mice? Or mice but not trackpads? Or trackpads but not touchscreens?


The term Modernism came from the art world. If I see it used I assume we are discussing the values that defined modernism.

I have doubts that there are no timeless designs in computing.


The term "modern" under discussion in this article has nothing to do with Modernism in the art world. It's just the regular dictionary and marketing sense.

What would you suggest is a timeless design in computing, then? That we can see holding up as an ideal solution in 1975, 1995, 2015, and today? Because I honestly can't even think of one, and I'm genuinely trying.

I mean, there are basic principles of "don't repeat yourself" and organizing code into functions, but those aren't aspects of design.


The first one I thought of was Claude Shannon’s contribution to computing and the second was expressed even earlier by Charles Babbage.

One can disagree, but it’s the differences that shed light on things.

I don’t think you are equipped to discuss design so I will leave it here.


> I don’t think you are equipped to discuss design so I will leave it here.

Please don't be insulting. I could just as easily say the same about you -- I can't imagine anyone talking about Shannon's contributions as falling in the category of design -- but please let's not make things personal.



