> This is largely mythology built up around a misunderstanding of the grain of truth at the core of it.
I agree. I've encountered a few space leaks in my Haskell code, but they've always been pretty easy to track down: profile the memory usage and look for a conspicuously huge peak. It will usually be fixable by forcing an intermediate result, or by using a different recursion strategy (e.g. a fold instead of explicit recursion). Haskell functions tend to be so small and single-purpose that there are no knock-on effects from doing this refactoring.
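As a sketch of what "forcing an intermediate result" looks like in practice, here's the textbook example: a lazy accumulator builds a chain of unevaluated thunks, and switching to a strict fold fixes it. (This is a hypothetical illustration, not code from the discussion; the leak shows up most clearly when compiled without optimizations, since GHC's strictness analysis can rescue the lazy version at `-O`.)

```haskell
import Data.List (foldl')

-- Leaky version: at -O0 the accumulator is a growing chain of
-- unevaluated (+) thunks, one per list element.
leakySum :: [Int] -> Int
leakySum = go 0
  where
    go acc []     = acc
    go acc (x:xs) = go (acc + x) xs  -- acc is never forced

-- Fixed version: foldl' forces the accumulator at each step,
-- so the intermediate result stays a plain Int.
strictSum :: [Int] -> Int
strictSum = foldl' (+) 0

main :: IO ()
main = print (strictSum [1 .. 1000000 :: Int])
```

The same fix could also be written with an explicit bang pattern (`go !acc ...`); `foldl'` is just the idiomatic packaging of that strictness.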
I think there's a perception that 'Haskell is difficult to debug', whereas the reality is more like 'Haskell will reject all of the easy bugs'. In other words, the average difficulty of a Haskell bug may be higher, but their density is lower.
Yes. Most recommendations for implementing recursion in Haskell that I've seen have amounted to writing it in terms of folds. You gain the benefit of experts having vetted the code, and of novices being able to understand what it is doing at a glance.
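A small (hypothetical) example of that rewrite, explicit recursion versus the equivalent fold:

```haskell
-- Explicit recursion: the traversal pattern is re-derived by hand.
lengths :: [[a]] -> [Int]
lengths []       = []
lengths (x : xs) = length x : lengths xs

-- The same function via foldr: the recursion scheme is a vetted
-- library function, and only the per-element step is left to read.
lengths' :: [[a]] -> [Int]
lengths' = foldr (\x acc -> length x : acc) []

main :: IO ()
main = print (lengths' [[1, 2, 3], [4, 5]])  -- [3,2]
```

The fold version also composes better: here it is just `map length`, which a reader can verify at a glance in a way that the hand-rolled recursion doesn't allow.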