I recently wrote a small source code counter, and as part of the process I naturally ran some benchmarks to compare it to the many tools that already exist. The results were somewhat erratic, but I was quite disappointed with Rust.
Results on the Rust repo:
Tool | Language | Time |
---|---|---|
polyglot | ATS | 218.3 ms |
loc | Rust | 139.5 ms |
tokei | Rust | 333.3 ms |
enry | Go | 5.183 s |
cloc | Perl | 16.47 s |
linguist | Ruby | 16.70 s |
## Analysis
Functional paradigms really shine here. Streams in ATS are just plain easier than in Rust (which, given the state of ATS documentation, says more about Rust than about me). The polyglot example is even compiled with a garbage collector, which puts just how slow Rust is into perspective.
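To make the comparison concrete, here is a minimal sketch of what an iterator-based ("stream"-style) line counter might look like in Rust. It is purely illustrative - the function name and the blank/code split are assumptions of mine, not code taken from any of the tools above.

```rust
use std::fs::File;
use std::io::{self, BufRead, BufReader};

// Illustrative sketch: tally blank vs. non-blank lines in one file
// by folding over a buffered line iterator.
fn count_lines(path: &str) -> io::Result<(usize, usize)> {
    BufReader::new(File::open(path)?)
        .lines()
        .try_fold((0, 0), |(blank, code), line| {
            Ok(if line?.trim().is_empty() {
                (blank + 1, code)
            } else {
                (blank, code + 1)
            })
        })
}

fn main() -> io::Result<()> {
    let (blank, code) = count_lines("src/main.rs")?;
    println!("blank: {blank}, code: {code}");
    Ok(())
}
```

Nothing here is exotic; the disagreement is over how naturally this fold-over-a-stream style composes in each language once the rest of the real problem shows up.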
The usual response to this is "you can write slow code in any language", which is of course not the point - ATS is far more expressive, easier to refactor, and has more safety guarantees. When you remove speed as a consideration, Rust starts looking a lot less attractive.
Moreover, functional programming in general is vindicated - abstraction is an underrated tool for performance, and here we see just how much we are missing out on in Rust. tokei takes longer on four cores than poly does on one, even using rayon and the Rust ecosystem! Not only is "functional programming is slow" false - in fact, the opposite is true! I suspect we will see much more use of formal verification and linear types in performant code as we go forward.
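For completeness, the rayon usage alluded to above amounts to roughly the following - a hedged sketch with made-up file paths, not an excerpt from tokei.

```rust
// Requires the rayon crate (e.g. rayon = "1") in Cargo.toml.
use rayon::prelude::*;
use std::fs;

fn main() {
    // Hypothetical input set; a real counter would walk the repository.
    let paths = vec!["src/main.rs", "src/lib.rs", "build.rs"];

    // Each file is read and counted on rayon's thread pool,
    // then the per-file line counts are summed.
    let total: usize = paths
        .par_iter()
        .filter_map(|p| fs::read_to_string(p).ok()) // skip unreadable files
        .map(|text| text.lines().count())
        .sum();

    println!("total lines across {} files: {}", paths.len(), total);
}
```

Switching from iter() to par_iter() is the only change from a sequential version, which is why the ecosystem argument comes up at all.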