It's odd to see an article emphasising backwards compatibility without a single mention of Microsoft (I even Ctrl+F'd the page source to check!). They've regressed a bit in recent years, but I'd still consider MS the gold standard for back-compat. I have Win32 binaries I last modified over 15 years ago, small (or perhaps tiny, with size measured in KBs) utilities that I use daily. They worked perfectly on Win95, and still do on Win10.
I wonder when/if the software industry will ever stabilise. It's possibly the only industry where frequent and disruptive change continues to happen, and is even welcomed by many in it (not me).
> I’ll also give a shout-out to our friends in the Operating Systems business: Windows, Linux, NOT APPLE FUCK YOU APPLE, FreeBSD, and so on, for doing such a great job of backwards compatibility on their successful platforms.
I share your desire for stability. I don't necessarily mind frequent updates for security reasons, but I'm getting to the point where, if something is changing in big ways often, I'm just not going to use it. I no longer find enjoyment in running on the upgrade treadmill; it's just work.
That said, I don't see a lot of hope for our outlook. Most people don't seem to have a concept of completable software. It would be nice if it were bug-free, but at least getting to the point where the cost of fixing the remaining bugs is comparable to the cost of leaving them as known issues is doable.
Linux code might be backwards compatible, but nothing like Windows. I can run pbrush.exe taken from Windows 95 on Windows 10, and it works fine. Want to run a 10-year-old binary on Linux? It's easier to set up a VM with a 10-year-old Linux release on it. You've got the source code and you can compile it? Well, tough: the build tools have changed, and some autoconf script has been removed, even from autoconf-archive.
Actually, I've kept an ancient statically compiled Mathematica (Linux) binary from 2000, and it still runs. I had to make some symlinks since the X11 font paths have changed. That's a full GUI app with a sophisticated internal kernel.
This should be generally true of any well-compiled static binary on Linux from 20 years ago.
OK, but he talks about how Android has had backward compatibility for over ten years, while Windows has been doing it for far longer, with a much bigger OS and a more complex hardware base.
"completable software". that's the ticket. the maturity to say no more features will be accepted; it is done.
for the past 20 years we have been teaching the agile way, that points to the constant churn of requirements, the pivots, and the idea there is always another sprint
The absolute worst in my small experience with it is anything touching the node ecosystem. Try to follow any guide or use any library not updated in the last week and there will be breaking API changes and deprecation warnings everywhere.
I'm running into this problem more and more. It seems like Node is fundamentally unusable for anything but the nimblest teams: teams that don't mind updating their dependencies automatically, every week, and that can fix whatever breaks within hours.
I really like Node as a language (especially with TS), but Node as an ecosystem feels like a very hard fit for 90%+ of corporations.
I think the culture of Node comes from the earlier culture of in-browser JavaScript. When running on the browser platform, you have to make sure your libraries are up to date and maintained; otherwise you're at the mercy of browser updates breaking things out from under your feet. (This is less bad now, but it was awful in the days of IE < 9, and on Android before they started shipping an auto-updating Chrome WebView.)
Consequently, JavaScript developers got used to the burden of maintenance and of rapidly updating to the latest versions of libraries, and this culture carried through to Node (partly because a lot of libraries are shared between Node and the web).
Having said that, if you pick your libraries well, it's not too bad these days. When I upgraded from Node 12 to Node 14 earlier this year, I had to upgrade the `pg` package to a newer version that supported Node 14 (there was one available), but I didn't have to make any code changes. And other than that I've had no forced version upgrades in a long time.
I guess if you're looking at 5-10 year timescales with literally no maintenance, then this would be a different matter, though.
I strongly feel that you must have end-to-end tests when using Node, because of the dependency hell. Not even knowing whether an upgrade of a dependency breaks your system is just hard, and testing it by hand is simply not maintainable.
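To make that concrete, here's a minimal sketch of the kind of smoke test I mean, using only what recent Node versions ship out of the box (the built-in test runner and global fetch), so the test harness itself adds no dependencies. The /health endpoint, the port, and the APP_URL variable are placeholders for whatever your app actually exposes:

    // Minimal smoke test sketch: assumes the app under test is already running
    // locally and exposes a (hypothetical) /health endpoint.
    import test from 'node:test';
    import assert from 'node:assert/strict';

    // Placeholder base URL; point it at wherever your app listens.
    const BASE_URL = process.env.APP_URL ?? 'http://localhost:3000';

    test('health endpoint still responds after a dependency bump', async () => {
      const res = await fetch(`${BASE_URL}/health`);
      assert.equal(res.status, 200);
    });

Run it after every dependency bump; if a transitive upgrade breaks the app, you find out from the test run instead of from production.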
There are massive differences in how much of a problem this is across different platforms.
With Clojure, I can think of 2 times ever when a dependency caused an issue. It was extremely obvious since the issue was "won't compile", and the fixes were simple.
With PHP, I expect any change to potentially break something. Bump your AWS SDK, which uses a different minor version of Guzzle? Fatal error.
There's a world of difference between breakage being an everyday thing and a true rarity.
This is only a problem in some parts of the ecosystem - but they are the parts that most 'tutorials' are written about, because they look shiny and fancy. If I'm blunt, they're the parts of the ecosystem that HN loves to gloat about in their "how I saved my company thousands of dollars by building a todo app with GlamorousLibraryJS" type posts.
The short answer: if you avoid the shiny tools that claim to do everything, you will not have this problem. And as a special case, avoid Gulp, which is mismanaged.
The single-responsibility libraries that have a well-defined scope, on the other hand, have often been sitting on npm unchanged for 5-6 years, because they are simply done, and they do what they need to. They will very likely never deprecate anything, as there is simply nothing to change.
I would argue that these libraries are actually doing a better job of stability than their counterparts in other languages.
Edit: An additional factor is that it's easy to write a tutorial about an extensive framework with a large scope; there's plenty of stuff to write about. Writing a tutorial about "this is how you make a function call to this library to parse a geo URI", on the other hand, probably isn't going to happen.
So if you follow third-party tutorials, you are naturally going to end up at the packages that are most prone to deprecations.
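To illustrate the contrast, here's a hypothetical sketch (not a real npm package) of the sort of single-purpose module I mean, one function with a tiny, well-defined scope:

    // Hypothetical single-purpose module in the spirit described above:
    // one function, a well-defined scope, no reason to ever break its API.
    export interface GeoPoint {
      latitude: number;
      longitude: number;
    }

    // Parses a "geo:" URI such as "geo:48.2082,16.3738" into coordinates.
    export function parseGeoUri(input: string): GeoPoint {
      const match = /^geo:(-?\d+(?:\.\d+)?),(-?\d+(?:\.\d+)?)/.exec(input);
      if (!match) {
        throw new Error(`Not a geo URI: ${input}`);
      }
      return { latitude: Number(match[1]), longitude: Number(match[2]) };
    }

Once something like that works and is tested, there's nothing left to deprecate, which is exactly why those packages sit on npm untouched for years.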
The sad part about the JS ecosystem is that even the biggest corps like Google can't keep their own JS ecosystem stable on the latest versions. I have a small Angular app with a Firebase backend, and every time I come back to it to update, the libs are out of sync.
Google is notoriously bad at maintaining (or even designing) their open-source stuff. I wouldn't say "even the biggest corps like Google" -- Google does considerably worse at this than many hobbyist maintainers! I just don't use Google libraries anymore, if I can at all avoid it.
See also my other comment[1] - Angular is an example of such a magical does-everything framework.
We've abandoned an app because of the inability to keep Firebase (the JS libraries) working on it. Between out-of-sync TypeScript definitions, broken APIs, broken interfaces, and "this version of some other dependency works with this version of Firebase thanks to some random transitive dependency", it just became too much. We won't use the JS libraries at all anymore.
Angular is the only one that has remained even remotely usable, but certainly not stable. It's never as simple as running the upgrade utilities, changing the version, and being done. It ALWAYS takes at least a day to find all the "little" things they didn't see fit to mention in the upgrade docs.
It's aggravating. I genuinely don't like using Google software, at all.
I've felt for a long time that Google is coasting on the momentum of the early web and the awe people felt for what they built early on. That hasn't been Google for a long, long time.
I went googling for some Raymond Chen examples of the great and terrible things Windows does in the name of back compat, and came up with this old thread: https://news.ycombinator.com/item?id=13450160
When IBM says it is going to deprecate something on the mainframe, it will deprecate it in probably 10-12 years' time. There will be three versions in between, and each version will have a much better migration path. A corporation that doesn't want to take risks will mostly be among the last to migrate: after confirming and consulting with all the IBM tech champions, it will start a migration process lasting about a year before installing the new version.
I think NetBSD, or any of the other BSDs generally, is also pretty good with backwards compatibility. I run programs well over 15 years old using the latest kernels. If I need to, I can build any of the older kernels too, because all the code is still readily available (= gold standard). The earliest one predates Win95.
Your Win32 binaries sound better than whatever software is being distributed by the companies today.
The popularity of Go surprises me when even relatively simple programs measure 10-20 MB or more. It is like Java. Is it even possible to write Go programs measured in KB?