not quite in the same area, but this advice reminds me of blizzard and world of warcraft. for years and years, people requested a "classic" WoW (for non-players, the classic version is an almost bug-for-bug copy of the original 2004-2005 version of the game).
for years and years, the reply from blizzard was "you think you want that, but you don't. trust us, you don't want that."
they eventually caved and launched classic WoW to overwhelming success. some time later, in an interview, ion hazzikostas (the game director) and holly longdale (vice president & executive producer) admitted that they got WoW classic very wrong and that the people "really did know what they wanted".
anyways, point being that sometimes the person putting in the feature request knows exactly what they want and they have a good idea. while your default mode might be (and perhaps should be) to ignore feature requests, it is worth recognizing that you may be doing so at your own loss. after all, you might not be able to fully understand every underlying problem of every user of your product -- but you might understand how to code the feature that they asked for.
It takes real courage for a builder to say, "It’s good enough. It’s complete. It serves the core use cases well." If people want more features? Great, make it a separate product under a new brand.
Evernote and Dropbox were perfect in 2012. Adding more features just to chase new user growth often comes at the cost of confusing the existing user base. Not good.
So when I first started dealing with the actual code, it scared me that the standard json library was basically in maintenance mode for some years back then. The standard unit test framework and a lot of other key pieces too.
I interpreted that as “Java is dying”. But 6 years later I understand: they were feature complete. And fast as hell, and god knows how many corner cases covered. They were in problem-solved, 1-in-a-billion-edge-cases-covered feature complete state.
Not abandoned or neglected, patches are incorporated in days or hours. Just… stable.
All is quiet now, they are used by millions, but remain stable. Not perfect, but their defects are dependable, depended on by many. Their known bugs are now features.
But it seems that no one truly wants that. We want the shiny things. We wrote the same frameworks in Java, then Python, then Go, then Node, then JavaScript, then TypeScript.
There must be something inherently human about changing and rewriting things.
There is indeed change in the Java ecosystem, but people just choose another name and move on. JUnit, the battle-tested unit testing framework, had a lot to learn from newer ways of doing things, like pytest. Instead of perturbing that stability, they just chose another name, JUnit 5, and moved on.
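The JUnit 4 → JUnit 5 move is an instance of a general pattern: freeze the old API and ship new ideas under a new name, so existing users are never disturbed. A toy sketch of that pattern in Python (the `v1`/`v2` names and functions here are hypothetical illustrations, not JUnit's actual API):

```python
# A hypothetical library that evolves by adding a new namespace
# instead of changing the old one -- the JUnit 4 -> JUnit 5 approach,
# where org.junit and org.junit.jupiter live side by side.

class v1:
    """Frozen API: behavior never changes, only bug fixes land here."""
    @staticmethod
    def assert_equal(a, b):
        if a != b:
            raise AssertionError(f"{a!r} != {b!r}")

class v2:
    """New API with pytest-inspired ergonomics, under a new name."""
    @staticmethod
    def assert_equal(a, b, msg=None):
        if a != b:
            raise AssertionError(msg or f"expected {b!r}, got {a!r}")

    @staticmethod
    def parametrize(cases):
        """Run a test function once per case, pytest.mark.parametrize style."""
        def run(test):
            for case in cases:
                test(*case)
        return run

# Old callers keep working untouched...
v1.assert_equal(2 + 2, 4)

# ...while new callers opt in to the new style.
@v2.parametrize([(1, 1, 2), (2, 3, 5)])
def test_add(a, b, expected):
    v2.assert_equal(a + b, expected)
```

The point of the design is that nothing under `v1` ever needs a breaking release; the new ideas get a new namespace instead.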
But I've noticed the products I actually keep coming back to are the ones that feel opinionated. They decided what they were and stuck with it. The ones that try to be everything usually end up being mediocre at all of it.
The WoW comparison in this thread is apt. The early expansions had clear identities. The later ones kept bolting on systems until playing felt like managing a spreadsheet.
The result: very few features. Which is exactly what I want.
The amount of hacking required to even be allowed to re-associate text files with that particular exe on Win11 was shocking to me. I get that windows is extremely hostile to its users as a general policy, but this one felt extra special.
But also, most modern software is in what I call "eternal beta". The assumption that your users always have an internet connection creates a perverse incentive structure where "you can always ship an update", and in most cases there's one singular stream of updates, so new features (that no one asked for, btw) and bug fixes can't be decoupled. In the case of web services like YouTube you don't get to choose the version you use at all.
Only 15 years later - now a couple of years back - I started to realize the importance of this. The apps on my smartphones seemed to get slower and slower by the year. The fast software experiences were a real joy amidst slow apps. I now have an appreciation for their opinion on 'evergreen' things like the speed of software.
ls: usage: ls [-@ABCFGHILOPRSTUWXabcdefghiklmnopqrstuvwxy1%,] [--color=when] [-D format] [file ...]
I don't think it knew when to stop.
There are great exceptions to this rule, even in paid software, where the authors are significantly poorer in exchange for producing better software. I imagine the authors of BeyondCompare or Magnet (for example) could have done a lot better financially for a while using a recurring license model.
There are also really stupid applications of this rule, such as what has happened with AutoMapper and MediatR in the last year or so, where the only meaningful commits since going commercial are the bits that check your license and try to fool you into paying :/
It seems like this shouldn't be a problem. It often takes only one developer willing to make a sacrifice for a particular class of software to exist that actually attempts to solve the problem and nothing more. But in reality, what we see over time is that the developers who did take a stand start to look out for themselves (which I have no problem with) and try to take what they can while they have market share.
How do we find a way to live in a world where developers can build useful things and be rewarded for it while also preventing every piece of software from turning into shit? I'm not sure what the answer is.
The best codebases I've worked with share a common trait: they have clear boundaries about what they don't do. The worst ones try to be everything and end up being nothing well.
This applies doubly to developer tools. The ones that survive decades (Make, grep, curl) do one thing and compose well. The ones that try to be platforms tend to collapse under their own weight.
Specifically he rolled out a "cave" system with procedural dungeon generation where players could mine through walls and other advanced systems, then undid all of it and ended with ~30 static layouts and very simplistic interactions. The entire game feels like a demonstration that simple, predictable and repeatable interactions with software have more longevity than cutting edge dynamic systems.
Even `ls` gets new flags from time to time.
I think "stopping" is great for software that people want to be stable (like `ls`) but lots of software (web frameworks, SaaS) people start using specifically because they want a stream of updates and they want their software to get better over time.
It grows and grows and eventually slows, or grows too much and dies (cancer), but kinda sheds its top-heavy structure as it's regrown anew from the best parts that survived the balanced cancer of growth?
Just forks and forks and restarts. It's not the individual piece of software's job (or its community's) to manage growth in the larger sense, just to eventually leave and pass on its best parts to the next thing.
> It predicts which ones you meant.
> It ranks them.
> It understands you.
This is so good I want to know whether someone generated this or wrote it by hand.
When ChatGPT first gained traction I imagined a future where I'm writing code using an agent, but spending most of my time trying to convince it that the code I want to write is indeed moral and not doing anything that's forbidden by its creators.
is to begin naming;
when names proliferate
it’s time to stop.
If you know when to stop
you’re in no danger."
agree with this point. new developers should care about this.
It is "their" distribution, to do with as they wish. If this would happen to your workstation, you are a fool, for not following release notes.
I already jumped distros for several reasons, marketing BS being one of them. I do not need the latest scam or flag of the month!
> Say no by default — every feature has a hidden cost: complexity, maintenance, edge cases
AI-assisted development is blowing up this long-standing axiom in the software development world, and I am afraid it's a terrible thing.
Just because you can do something, doesn't mean you should.