At one point, someone on buzz was advocating haskell, and he pulled out the usual "look how elegant the fibonacci function looks!" thing. Someone else was doing the "the point must be productivity, and you can't prove it adds more productivity" thing. I think the point is fun, not productivity. I wrote a big long response, but then felt like getting involved in a big language discussion on work time was kind of a waste of time, so I didn't bother posting it. And it was mostly going to be a self-indulgent preaching-to-the-choir thing anyway. But then I thought, well, as long as I've written it I might as well put it somewhere. And I don't believe I've noticed anyone saying it in exactly this way before. Anyway, here it is:
I know people like to pull out fibonacci and quicksort, but they don't seem all that compelling to me.
Instead:
Think about the whole thing with iterators or generators or whatever else so many imperative languages have. That's all gone.
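To make that concrete, here's a rough sketch (the names are mine): a lazy list is the generator, and the consumer decides how much of it to force.

    -- Lazy lists standing in for generators: the "infinite" sequence is
    -- just a value, and the consumer pulls only what it needs.
    naturals :: [Integer]
    naturals = [0 ..]              -- nothing is computed yet

    evens :: [Integer]
    evens = filter even naturals   -- still lazy; this is the whole pipeline

    main :: IO ()
    main = print (take 5 evens)    -- forces exactly five elements: [0,2,4,6,8]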
Select vs. threads vs. coroutines is also gone, but that's just a ghc feature. All that flap about asynchronous events and event-based programming? Gone.
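For instance, something like this (a toy sketch, names made up): the worker is written as straight-line blocking code, and the runtime multiplexes it onto cheap lightweight threads instead of making you build a select loop or callbacks.

    import Control.Concurrent (forkIO, threadDelay)
    import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)

    main :: IO ()
    main = do
      done <- newEmptyMVar
      _ <- forkIO $ do            -- a cheap green thread, not one OS thread per task
        threadDelay 100000        -- "blocking" wait; the runtime schedules around it
        putMVar done "worker finished"
      msg <- takeMVar done        -- block until the worker reports back
      putStrLn msg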
Think about that whole null pointer thing that all these imperative languages have. Null pointer exceptions? Gone!
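Instead there's Maybe, and the compiler makes you say what happens in the Nothing case. A small sketch (the phone book is made up):

    import qualified Data.Map as Map

    phoneBook :: Map.Map String String
    phoneBook = Map.fromList [("alice", "555-1234")]

    describe :: String -> String
    describe name =
      case Map.lookup name phoneBook of  -- Maybe String: no null to forget about
        Just number -> name ++ " is at " ++ number
        Nothing     -> name ++ " is unlisted"

    main :: IO ()
    main = mapM_ (putStrLn . describe) ["alice", "bob"]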
Think about that whole thing with references vs. values. That's gone. Or think about the thing with identity vs. equality vs. deep equality; that's gone too.
All that stuff about copy constructors or equals methods or all the 'this.x = x' constructor garbage, all that boring boilerplate is gone. No more writing toString()s or __str__()s by hand; you can get a useful one for free.
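Roughly, one 'deriving' clause covers it (a toy example): structural equality and a printable form come for free.

    data Color = Red | Green | Blue
      deriving (Eq, Show)

    data Point = Point { x :: Double, y :: Double }
      deriving (Eq, Show)             -- no equals(), no toString(), no copy constructor

    main :: IO ()
    main = do
      print (Point 1 2 == Point 1 2)  -- True: plain value equality, no identity question
      print (Point 1 2)               -- prints Point {x = 1.0, y = 2.0} for free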
Or how about all that stuff about "const" or "final" or "const correctness" and separate const and non-const methods, that all just goes away and good riddance.
Type casts are gone too.
All that hairy generics stuff is vastly simplified without subtyping. Contravariance vs. covariance, sub/super constraints on generics, multiple vs. single inheritance, the whole is-a vs. has-a thing: all gone. Yes, it means there's no dynamic dispatch, but I don't miss it much. If you want to parameterize on an operation, just pass the operation as a parameter, instead of passing an object that dynamically dispatches on one of several operations.
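A made-up example of what I mean by passing the operation: the ordering policy is just a function argument, no Comparator interface or subclass needed.

    import Data.List (sortBy)
    import Data.Ord (comparing)

    data Employee = Employee { name :: String, salary :: Int }
      deriving Show

    -- The caller supplies the operation; 'rank' doesn't care where it came from.
    rank :: (Employee -> Employee -> Ordering) -> [Employee] -> [Employee]
    rank = sortBy

    main :: IO ()
    main = do
      let staff = [Employee "ann" 120, Employee "bo" 90]
      print (rank (comparing salary) staff)
      print (rank (comparing name) staff)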
Think about all the mandatory type declarations those not-python languages have. That's gone. Or maybe think about how slow and single-threaded and hard to deploy and runtime-error-loving python is, that's gone too.
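For instance, there isn't a single type written in this little sketch (the function is made up), and it's still all checked at compile time:

    -- No declarations anywhere; ghc infers a perfectly general type for 'average'.
    average xs = sum xs / fromIntegral (length xs)

    main = print (average [1, 2, 3, 4])   -- passing it a string wouldn't compile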
That whole thing about control structures being built-in syntax and different from functions? That's gone. It's hard to quantify the difference it makes because you stop thinking about the difference between them and just structure your code as feels most natural. Like the whole order of definition thing (that's gone too), it's just one more restriction removed. The whole statement vs. expression thing is gone too.
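For example, here's a control structure I can just write myself (the name and scenario are made up): it's an ordinary function, and the recursion is the loop.

    -- 'retry' is not syntax, just a function: run an action up to n times
    -- until it produces a result.
    retry :: Int -> IO (Maybe a) -> IO (Maybe a)
    retry 0 _      = return Nothing
    retry n action = do
      result <- action
      case result of
        Just _  -> return result
        Nothing -> retry (n - 1) action

    main :: IO ()
    main = do
      r <- retry 3 (return (Nothing :: Maybe Int))  -- stand-in for a flaky action
      print r                                       -- Nothing, after three attempts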
That lengthy edit-compile-wait-debug cycle is mostly gone too. It usually takes less than a second to reload a module and start running the code you wrote one second ago.
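The loop looks roughly like this in ghci (the module and function names here are made up): run something, edit the file in your editor, :reload, run it again.

    $ ghci Reports.hs
    ghci> summarize january
      (edit Reports.hs in your editor)
    ghci> :reload
    ghci> summarize january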
And of course, this is just a list of the things that are gone. There's an equally large list of new things that are there, but it might be hard to see how those are useful from the outside.
I have a project with more than 200 source files, each one defining from 5 to 20-ish data types. If it were java or c++, with their heavyweight class declarations, each one would be a directory with 5 to 20 files in it, about one per data type. No one is going to go to all that work, so the result is that you create types for the big abstractions and pass the rest around as ints and doubles and strings, or Maps and Lists. The result is undescriptive signatures, search-and-replace expeditions when a type changes, reusing "almost right" types (which creates extra "impossible" error cases to deal with), mixed-up values, and of course runtime type errors.
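In haskell a new type is one line, so it's cheap enough to give every little concept its own. A made-up sketch of what that looks like:

    newtype UserId  = UserId Int     deriving (Eq, Show)
    newtype OrderId = OrderId Int    deriving (Eq, Show)
    newtype Email   = Email String   deriving (Eq, Show)

    data Order = Order
      { orderId  :: OrderId
      , customer :: UserId
      , contact  :: Email
      } deriving Show

    -- Passing a UserId where an OrderId belongs is now a compile error,
    -- not a runtime surprise or a mixed-up value.
    placeOrder :: UserId -> Email -> OrderId -> Order
    placeOrder who addr oid = Order { orderId = oid, customer = who, contact = addr }

    main :: IO ()
    main = print (placeOrder (UserId 7) (Email "a@example.com") (OrderId 1001))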
Similarly, when defining a new function requires a minimum of three lines of boilerplate and some redundant type declarations that need to be updated whenever your types change, and the function maybe has to be defined half a screen away from where it's used, you create a lot fewer functions and factor less. If it's too annoying to do, no one will do it. And if they do, they will do it because they feel obligated to do the right thing, even if it's tedious. Not because they're having a good time.
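Compare that to a where clause, where a helper is one line sitting right next to its only caller (a toy example):

    summarize :: [Double] -> String
    summarize xs = "mean " ++ fmt mean ++ ", range " ++ fmt range
      where
        mean  = sum xs / fromIntegral (length xs)
        range = maximum xs - minimum xs
        fmt v = show v    -- deliberately trivial: naming an idea costs one line

    main :: IO ()
    main = putStrLn (summarize [1.5, 3.0, 4.5])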
To me, the main compelling thing is *fun*. All of that stuff which is gone is mostly boring bookkeeping crap. I never get excited thinking about putting in new 'const' declarations or equals() or toString() methods. I don't look forward to all the TypeErrors and NameErrors and ValueErrors I might face today, or waiting for the application to relink. Programming is more fun without that stuff.