I'm more and more convinced that a language is only as good as its ecosystem. Yes, language features and ergonomics do matter and are what initially draw people, but in the long term, without robust supporting tooling, well-designed libraries, and a substantial standard library, a language will eventually decline.
Neither Java nor Python is an incredible language as far as baseline features and semantics are concerned, but they continue to thrive because their ecosystems are fantastic and both have really rich standard libraries to build from. Conversely, quite a few languages I love from a language-design perspective fail to gain as much traction for lack of well-developed ecosystems.
You're making a mistake that many people make: treating the language and the ecosystem as two (almost) orthogonal things.
But in reality the language has severe impact on how the ecosystem can and will develop.
For instance, when looking at statically typed languages, it makes a huge difference whether they offer a solution to the expression problem. Such a solution allows two authors to write two completely separate libraries (e.g. an HTTP server library and a JSON (de)serialization library) that know nothing about each other, and then allows a third author to write yet another separate library that connects the two, so that the HTTP library can use the JSON library.
The end user then just has to import all three libraries and has compile-time guarantees that they work together.
If the language does not support that, people will find workarounds, but that still heavily shapes the ecosystem.
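A minimal sketch of that three-library pattern in Rust, with modules standing in for the three separate crates (all names here are hypothetical). Note that across real crates, Rust's orphan rule forces the glue impl into a crate that owns either the trait or the type, which is exactly the kind of workaround alluded to above:

```rust
mod json_lib {
    // "Library 1": a JSON serialization interface.
    pub trait ToJson {
        fn to_json(&self) -> String;
    }
}

mod http_lib {
    // "Library 2": an HTTP response type that knows nothing about JSON.
    pub struct Response {
        pub status: u16,
        pub body: String,
    }
}

mod glue {
    // "Library 3": connects the other two without modifying either.
    use crate::http_lib::Response;
    use crate::json_lib::ToJson;

    impl ToJson for Response {
        fn to_json(&self) -> String {
            format!("{{\"status\":{},\"body\":\"{}\"}}", self.status, self.body)
        }
    }
}

fn main() {
    use crate::json_lib::ToJson;
    let resp = crate::http_lib::Response { status: 200, body: "ok".into() };
    // The end user imports all three and gets a compile-time guarantee
    // that the pieces fit together.
    println!("{}", resp.to_json());
}
```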
> But in reality the language has severe impact on how the ecosystem can and will develop.
This so much.
One often-omitted property is simplicity of the language. If the language is straightforward, there's a much larger audience of potential contributors.
Being opinionated about code style is also of value, as it makes it easy to contribute to others' libraries.
It's one of the reasons I think Go did quite well as far as quick library-ecosystem creation goes. It's fairly simple, at least as far as the basics go, and diving into and contributing to third-party libraries is very easy, thanks to opinionated tools like gofmt and the overall simplicity of the language.
Whenever I need something in a library that's not supported yet, I'm able to achieve it in less than a few hours, by quickly diving in and making the changes, even in an unknown codebase. The magic is that this works for the vast majority of libraries.
> One often-omitted property is simplicity of the language. If the language is straightforward, there's a much larger audience of potential contributors.
With regard to "simplicity", people often seem to confuse syntax with semantics: they mistake complex semantics for complex syntax. For example, people say that Rust has complex syntax, but what's actually happening is that Rust is trying to represent very complex semantics, which necessarily results in complex syntax. This gives the developer more choice in how things are implemented, without extra cost. On the opposite end is Python, which has very simple semantics but is highly opinionated about how things work underneath, and is thus inherently slow for almost any serious computation.
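To illustrate with a hedged sketch (a hypothetical function, not from the thread): Rust's lifetime annotations are extra syntax, but they encode a semantic contract that Python has no way to state or check:

```rust
// The lifetime 'a is "noisy" syntax, but it expresses a guarantee:
// the returned reference lives no longer than the inputs it borrows from.
fn longest<'a>(a: &'a str, b: &'a str) -> &'a str {
    if a.len() >= b.len() { a } else { b }
}

fn main() {
    let s1 = String::from("ecosystem");
    {
        let s2 = String::from("lang");
        let r = longest(&s1, &s2);
        println!("{}", r); // fine: both inputs are still alive here
    }
    // Letting `r` escape the inner block would be rejected at compile
    // time, because s2 is dropped there. The complex syntax is the
    // price of making that semantic rule explicit and checkable.
}
```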
I really enjoy Nim for this. It’s also really easy to dive into almost any library quickly.
Especially for implementations of cool algorithms where sometimes the only other implementation is a C++ version. There’s a cool sub-ecosystem of people who fiddle with algorithms in Nim. They’re often not even in the package manager, just someone who implemented some paper or whatnot.
There is not much point in “being simple”, as essential complexity has to live somewhere. I much prefer that living in some compiler-enforced construct than in hand-written conventions. Go is terrible because it doesn’t let you use the former way at all.
What I've seen with the current generation of languages with a lot of compiler-enforced guarantees is that they introduce a lot of needless complexity to the vast majority of problems, which are very often simple.
Overdoing monadic constructs is probably the biggest offender here, but Rust's lifetime checking (even though I like the design overall) will usually also cost you way more time to get right than just using a GC (if you don't need complicated concurrency).
Main point being, it's not that easy to contribute to libraries written in these.
I believe that's a different axis: a GC solves the whole problem at a different level (the runtime), but in languages that choose not to take that tradeoff, the complexity has to live somewhere.
Monadic constructs are just a convention, though; I think that's different.
And those Python and Java ecosystems are the result of years of effort from their respective communities. When I first started learning Python, the question people would raise was “Why not just use Perl? cpan has just about anything you could want!”
A language needs to be good enough for developers to enjoy using it. Java was a breath of fresh air compared to 90s C/C++ code, so people were very motivated to build with it. Likewise, Python had a reputation for “Looking like pseudo code” which made it attractive to devs who needed a language that would let them build out ideas quickly.
> A language needs to be good enough for developers to enjoy using it.
That's too far. A language simply needs to not be actively bad.
Perl, unfortunately, was actively bad because everybody had their own subset of Perl that they knew and none of them overlapped.
When I was interviewing for VLSI positions back in the mid-1990s, everybody would ask if you knew Perl. I would start whiteboarding and invariably write something in Perl that the interviewer wouldn't know--which would effectively derail the interview. This is actively bad.
Python, by contrast, never had that problem. I used to carry printouts of a reasonably simple program in both Perl and Python (a Verilog VCD parser) to interviews so I could discuss them with the interviewer. The interviewer would always like the Python program better, even if they didn't know Python.
(although, to be fair, it seems that Python has grown that subsetting problem recently--how times change ...)
This is where TypeScript had a massive leg up, it obviously had access to the whole JS ecosystem. The community then created a type annotation catalog for popular libraries that didn't have them themselves.
"Modern JS" has adopted many of the best features from TypeScript, but back when it launched it added so much cleaner syntax and abstractions over JS.
Others that have followed a similar route to success are Scala on the JVM and Elixir on the Erlang VM.
I use Nim every day, so I am biased, but previously I used JS, and it felt like the JS libraries were just not good quality. NPM seemed full of old, broken, and unmaintained libs, some of which did trivial things. I was never happy with their quality. For me, JS had the quantity but not the quality.
With Nim, yeah, there are far fewer libraries, but the ones it does have somehow feel better. And when one doesn't exist, you just write your own code for your specific use case, which is usually not much code and fits better.
I think one of the biggest issues we currently suffer in the programming field is the use of libraries upon libraries, layers upon layers, without understanding.
> I think one of the biggest issues we currently suffer in the programming field is the use of libraries upon libraries, layers upon layers, without understanding.
I think there’s a balance here. We obviously are way better off with all the layers we’ve built, whether it’s compiler optimizations or kernels or VMs or runtimes or.. or..
We have some widely accepted abstractions and layers like “android” or “Linux” where we don’t question the underlying code too much and we just accept and appreciate it. They’re also heavily maintained for us. 20 layers of left pad? Not so much.
It’s almost the standardization problem. We need bigger and more feature-rich runtimes and standard libraries as we tackle more complex and bigger problems. But if every dev has to chart their own path through 1000000 combinations of libraries and runtimes and components, we can’t -as a society- maintain them and battle test them with the same rigor. Like containers, we’ve started to build established norms and tools (docker, oci, etc) freeing us from all the details, when the abilities were always there within Linux.
It's true there are a lot of low-quality JS libs, but there is also a vast number of high-quality libs too; you just need to search them out. In my experience, libs written in TS tend to be of higher quality.
The main issue with JS and Node was the number of small libraries that do one thing, and the overuse of them. If you stick to a framework, Lodash (rather than lots of small libs), and any specialist libs you need, it all works really well.
I don’t know, it is still way bigger than many languages commonly mentioned on HN. Also, Scala 3 is a really great revamp, plus most of the community actually prefers pragmatic FP a la Li Haoyi.
Rust has a large ecosystem: Cargo, rustup, and a big library registry. Rust's stdlib is deliberately small, aiming for portability; that's why regex isn't in the stdlib.
I don't think large is necessarily a good thing. There was a recent Reddit thread on Rust where someone was trying to pick through 80-something libraries for a Bloom filter. The most downloaded one didn't have docs.
I have one: scientific software. There's a growing recognition that Python is creaking under the weight of scientific applications that lean on it. It's very difficult to write efficient, maintainable, large-scale software systems with it.
An example that comes to mind is data reduction pipelines for the bytes coming out of a major telescope, like that of LSST. I worked on those pipelines, and a lot of the effort was around figuring out how to consistently build the dang thing given the looseness of Python's systems. An equally large amount of the effort went into trying to eke out performance from Python via NumPy tricks, Numba, etc, which impinged on readability and maintainability.
In some places, we fall back on a C++ core, but nobody is happy about that, for all the usual reasons. Some of us advocated for Rust... but it was never really plausible, because the interactions with Python numerical and scientific libraries are so immature. People are starting to work on it but it's hard to compete with the decades of work put into C++ and Python as the core of scientific stacks.
Yeah, with things like Are We Web Yet, Rust presents itself as a can-do-anything language.
Which, sure, technically it can, but at the moment I think it's only appropriate when you have an army of developers at some money-printing company who go and write all your dependencies.
I’ve said it before and I’ll say it again, although I think it’s an unpopular opinion:
Rust introduces unsafe code (at a minimum through crates). The Rust community hates that people use C++ when they could use a memory-safe language, so why are they introducing memory unsafety into domains where a GC works fine?
It’s like their view of memory safety as a necessity totally flips the second Rust is the unsafe one.
Every language has unsafe bits if you look deep enough. CPython and Java's garbage collector are written in C, and we have ctypes and JNI, but I think most people would still consider those memory safe languages. The important question is: is there a well-defined safety boundary, somewhere between the CPU and your application code, where memory is guaranteed* not to be corrupted and the programmer is freed from worrying about low-level bugs like segfaults?
*assuming no outside bugs such as CPU bugs, bugs in the GC, or bugs in an unsafe block
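A minimal sketch of such a boundary in Rust (a hypothetical helper, not from any particular library): the unsafe operation is encapsulated behind a safe API, so the proof obligation lives in one small, auditable spot:

```rust
// A safe wrapper around an unsafe operation. Callers cannot corrupt
// memory through this function; the safety argument is local to it.
fn first_byte(bytes: &[u8]) -> Option<u8> {
    if bytes.is_empty() {
        return None;
    }
    // SAFETY: index 0 is in bounds; we checked is_empty() above.
    Some(unsafe { *bytes.get_unchecked(0) })
}

fn main() {
    assert_eq!(first_byte(b"hi"), Some(b'h'));
    assert_eq!(first_byte(b""), None);
    println!("boundary holds");
}
```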
This doesn’t make sense. So Rust lets you write unsafe code, and now that’s the same as any language that calls C in any way?
Why replace C++ then? After all, Rust is just as unsafe by your definition.
Having a language that is default safe and can only be unsafe in rare situations is way better than one that lets libraries write unsafe code, which is better than a language with no safe boundary.
> Having a language that is default safe and can only be unsafe in rare situations
This describes both Python and Rust. They are default safe, with escape hatches (ctypes or unsafe blocks) for rare situations. Writing unsafe code in Rust should be as rare as using ctypes in Python.
> is way better than one that lets libraries write unsafe code, which is better than a language with no safe boundary.
Python lets libs use ctypes to poke at raw memory. Solution: avoid those libraries.
Rust lets libs use unsafe blocks. Solution: avoid those libraries (and optionally use tools like cargo-geiger or forbid(unsafe_code) if you care enough).
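For illustration, a minimal sketch of that crate-level switch: `#![forbid(unsafe_code)]` is a real Rust lint that turns any unsafe block anywhere in the crate into a hard compile error (the `checked_sum` function below is just a hypothetical stand-in):

```rust
// At the crate root (e.g. main.rs). Unlike deny, forbid cannot be
// overridden by inner allow attributes deeper in the crate.
#![forbid(unsafe_code)]

// Everything below is guaranteed free of unsafe blocks at compile time.
fn checked_sum(xs: &[i32]) -> i32 {
    // Safe iteration; writing `unsafe { ... }` here would not compile.
    xs.iter().sum()
}

fn main() {
    println!("{}", checked_sum(&[1, 2, 3]));
}
```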
If you don’t see a parallel between what you just said ("just use these tools to catch the unsafe parts of Rust") and the C++ crowd saying "just use these tools or do these things to catch the unsafe parts of C++", there’s no point in continuing this discussion.
Python ctypes is not nearly as common as Rust unsafe code, and geiger only goes so deep in its search for unsafe code, if I remember correctly.
So your opinion is that safety holes are only important if the syntax used is "common"? IME unsafe Rust code in the wild is rare and there's a strong culture against its use. Meanwhile in Python, ctypes and C modules are very common -- just about all popular libs use C under the hood, numpy, scikit, django, etc. It's tough to do anything useful without it.
Does Python have any similar tooling for managing all that unsafe code? Or is the only option to hope and pray that the memory vulnerabilities are not too common?
You can say "Python calls C and that makes it memory unsafe", but so does every language, including Rust. It is well known that GCs are memory safe; all those articles about memory safety push for GCs first and Rust second.
> About half of Python libraries in PyPI may have security issues
Pretty bad, right? I'll take the ecosystem with 265 issues over the one with 749000 issues any day.
> so does every language
Now you're getting it! Every language is unsafe, and always will be. They all run on a CPU with bits and bytes. The difference between Rust and Python is a difference in degree, not in kind. The only hope a programmer has is to constrain that necessary unsafety to be as small as possible. Rust handles this with unsafe blocks - they are very rare, easily greppable and easily auditable. Python has, well, nothing. It's built on a mountain of C, the interpreter, stdlib, and most libraries. The mountain is unsafe by default, and there's no way to decompose the problem to smaller pieces. This pile of unsafe C doesn't magically become safe just because you used a scripting language to call into it. (Just like using Python to run an external C program doesn't make the C program safe)
You came into this thread wondering why you get downvoted when you talk about this? It's probably because you're applying inconsistent standards to different languages. You decided Python is "safe", and when you find out about the C underneath, you ignore it and stick your head in the sand and make excuses for the memory vulnerabilities. Meanwhile you decided Rust is "unsafe", and use any rare counterexample to discredit the practical improvement over its predecessors. I believe this is the fallacy currently on display: https://en.wikipedia.org/wiki/Nirvana_fallacy
You didn’t even read the Rust article. The problems in the Python article aren’t specifically memory violations; my point was that Rust claims to be "safe" and yet all those vulnerabilities were memory-safety violations. Those wouldn’t be possible in Python.
I would argue you're the one falling into that fallacy. You decided that because Python calls C, it's unsafe, and so is Rust.
My issue is not with that really, but the odd double standard. Comparing calling C code to being able to directly write to memory is just odd. It’s not at all the same in practice.
It’s like memory safety is black and white when you compare Rust and C, but then all grey when it’s Rust and anything else. Marking things "unsafe" in Rust doesn’t magically make it better, and unsafe Rust code is far more common than memory violations in GC languages.
But it’s clear you have decided you are much smarter, I don’t see the point in continuing this discussion.
Nit: Django and all its required dependencies are pure Python. (It does have an optional dependency on Argon2 for password hashing, though, which uses C.)
Looks like ctypes is used in gdal/geos, core/locks, the Oracle DB driver, and the test runner (but just c_int??). (And I'm ignoring auth, where ctypes stands for "content types".)
Huh, yeah. I think geos is the only one of these that's actually calling native code not provided by the platform (it calls a GIS library written in C++). The file-locking code calls into the Win32 API to do something that on Unix is provided by the Python standard library. The Oracle driver code is working around a bug in Cygwin. And the test runner isn't doing FFI at all; it's setting up some multi-process shared memory, and the API requires that a ctypes type be used for that, since you don't want to use regular Python types that can allocate at will (and might do so in the wrong part of memory) for that purpose.
The typical Django deployment uses none of these, but on the other hand, it does use a third-party database driver written in C, so I guess the point stands.
Isn't it obvious? The position is not against the use of code that might cause UB, it's against making that the default everywhere. In Rust, unsafe is opt in and can be encapsulated. That's the key bit.
And no, I am not; you are saying that. I think Python is way safer than either; you seem to think that because you can crash a Python program, the memory vulnerabilities of Rust are justified.
It's not just crashes. Python's ctypes API can do anything that unsafe Rust can do, including the parts that permit arbitrary code execution vulnerabilities.
> GCs do various resource clean up that mean certain bugs are not exploitable. Rust does not have that.
All of the work Python does at runtime with a GC, Rust does at compile time with the borrow checker. GC is orthogonal to the idea of memory safety.
> And you didn’t address my second point.
I don't know what you expect. It's a semantically empty statement and thus unfalsifiable. You can flip it around without changing the truth or the meaning:
> you seem to think because you can crash a Rust program that means the memory vulnerabilities of Python are justified.
C++ is pervasively memory-unsafe; a bug in any part of a C++ codebase can easily cause undefined behavior, such that you can't isolate the memory-unsafe parts and apply extra assurance techniques to them. Most modern garbage-collected languages do let you do this, as does Rust.
If you are a data scientist coming from Python, take a look at nimpy, a great way to just import Python libraries and use them! https://github.com/yglukhov/nimpy NumPy, pandas, and PyTorch are all usable from Nim.
Nim is the ultimate glue language; use libraries from anything: Python, C, JS, ObjC.
I just started using Nim, but the glue aspect was the biggest draw for me. It was pretty easy to hook up a socket interface to a SystemVerilog simulation and provide a simple device interface to expose to the SW and systems teams. Previously, I would have done it in Python, but that always ended up with a short lifespan due to its slow speed.
One unusual hurdle for Nim right now is that on Windows, many anti-virus vendors are overly aggressive in flagging as bad any executable that was written in Nim [1].
Don't know if they've made much headway lately, but for a while even the main installer was getting flagged, as well as tools like choosenim.
The guy needs better debugging facilities in Nim, so why don't you guys discuss a way of organizing funding for better debugging facilities instead of going off on a tangent?
The inability of HN posts to stay on topic is pretty annoying.
The one thing that's really holding me back from Nim: the lack of an integrated debugger in Jetbrains' IDEs.
Yes, I can hold lots of state in my head; and yes, I can make do without a debugger, if I have to.
But I don't want to. Especially since I've been so spoiled with PyCharm and Clion's way of doing it. My debugging time is limited and valuable, and a debugger that quickens my inspection of a program is nowadays so essential to my workflow that I'd rather hold back from committing to Nim at all until its debugger tooling (at least to the level of that in Jetbrains' IDEs) is available.
I found Nim to be the best in the compile-to-C class of languages/tools from the 2010s.
The problem was that it wasn't (and possibly still isn't) available in the default Fedora repo while other, less popular languages were. No idea why; might be due to low adoption.
Is that really a barrier? Nearly every system package manager contains extremely outdated versions of the languages I use (or none at all, like Golang or Java).
If the language is good, it won’t be very hard to install it without a package manager.
choosenim has two bullet points as its raison d'etre (quoting from its Readme):
- Provide an easy way to install the Nim compiler and tools.
- Manage multiple Nim installations and allow them to be selected on-demand.
The first one I already get with the package manager I use for everything else. The second one I don't really care about as a normal user: I just need the one version at a time. As much as possible, I would rather stick with one way of managing tools rather than having sub-managers that I interact with infrequently and have to remember each time. (The same applies for me to other ecosystems, like Python.)
I'm sure it's useful for people who are working on nim itself, but speaking personally, choosenim actually has negative utility.
The authors praised Rust's debugging capabilities in VS Code as well; can somebody point to some references on this? From what I've seen around, and what I'm able to debug in my VS Code environment, I might be missing something…
For Linux, I haven't used VSCode for debugging myself, just `gdb --tui`. Since VSCode supports driving gdb I would think everything would work fine.
For Windows, I used to debug Rust using Visual Studio (not Code) without problems, because Windows binaries built for the windows-msvc target generate regular .pdb files that VS can work with regardless of the source language.
Both VS and gdb have a problem where the representation of Rust values when printed / watched is funky, especially with enums, because the Rust type gets translated into C terms. But it's not unworkable.
> Publishing your app as native AOT produces an app that is self-contained and that has been ahead-of-time (AOT) compiled to native code. Native AOT apps start up very quickly and use less memory. Users of the application can run it on a machine that doesn't have the .NET runtime installed.
I am doing an experiment of trying out C# as a Python replacement for automation because of (1) the strength of the built-in libraries and (2) not requiring an interpreter. Bundling the C# runtime into the application is doable, and the tools are super portable, with no changes so far between Windows/Mac/Linux.
However, the smallest I've been able to get the standalone executables with bundled stripped down runtimes is ~9 MiB.
Even with this limitation, I've been super happy with this experiment so far. I'm not sure how committed I am yet to C# over Python, but it's pretty breathtaking what all you get out of the box with .NET
Coming from a mostly-Python background myself, I'm curious if you've looked into F# for this. It's succinct like Python, gives you type safety, and you can use .NET libraries.
I've thought about it, but I spent a lot of time going down the rabbit hole with functional languages with Haskell a few years ago and was just checking out the .NET ecosystem as a whole since I wasn't familiar with C#.
The experiment was "How productive and portable is .NET Core?" knowing that F# is there as a possibility. I dabbled in the tangentially similar OCaml a long time ago, but had wanted to just move fast to answer "Is .NET a reasonable technical choice for the cross-platform tooling I need?" So far it's been a pretty solid "Yes."
It's supposedly been possible for quite a while, but every time I talk to someone (granted only a few people who would actually want to use and benefit from it in their software) about it they say "it's missing X" or something of the sort that makes it a total non-starter.