Ah, but you don't always have to copy memory around to support immutable data structures. The Koka and Roc languages are at the forefront of a functional-but-in-place (FBIP) style of memory management: they use fast reference counting, and when a function owns the sole reference to a data structure, it can mutate it in place instead of deallocating and reallocating.
The most recent Perceus reference-counting paper, implemented in Koka, showed Koka's purely functional, persistent red-black tree running about 10% faster than a handwritten C++ version using std::map [0], and Roc has a purely functional quicksort competitive with standard imperative languages.
Does Perceus differ substantially from using https://lib.rs/crates/im and writing code which looks identical to imperative code, but cloning is faster and mutations are slower (and copy internal structures as necessary)?
It does the same mutate-if-unique optimization as im, but automatically and more broadly, as part of its optimized reference counting. Perceus can also go beyond what im can do by reusing allocations across different static types: one of their examples performs a tree traversal in O(1) extra space by writing it with a zipper, which Perceus optimizes into a Morris traversal, and the programmer doesn't have to do anything special to get the benefit. (Of course, if you do know how the optimizer works, writing code you know will be optimized has performance benefits.) This sets up a wonderful situation where you can write easier-to-maintain, easier-to-verify algorithms that get optimized into very efficient versions that would be hard to write by hand even in a mutable/imperative language.
Their papers and online docs are very good, I highly recommend them for more information!
Both are working on making functional programming practical and very fast. Koka is more of a research language; it is also developing algebraic effects and making them practical, which is very cool.
So it seems we can have our cake and eat it too!
[0] - https://www.microsoft.com/en-us/research/uploads/prod/2021/1...