No, man. It's certainly useful to read a developer's commit history. But at the end of the day, commit counts are just as silly and misleading as any other metric.
The only metric that matters is business value generated. A really valuable developer might make important strategic contributions to a project through just a handful of commits, while a particularly weak (or negative-performing) committer might be making zillions of commits while actually doing irreparable damage to the project before anyone notices.
Those activity charts might make good eye candy, but if you don't know your developers or understand their working styles, they just won't tell you that much.
(OP here) Oh, I must not have been very clear on the purpose behind tracking commits vs. time: it's for accounting purposes, when recording what time gets capitalized and what gets expensed. As for it being misleading, when we matched commits to time spent per project, they lined up remarkably well, as mentioned in the article.
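To make the cross-check concrete, here's a minimal sketch of the kind of reconciliation described, assuming commits can be tagged with a project and time entries are logged against the same projects. All field names and numbers below are hypothetical, purely for illustration:

    from collections import defaultdict

    # Hypothetical records: commits attributed to a project, and time entries
    # logged against the same projects (fields and values are illustrative).
    commits = [
        {"project": "billing", "author": "alice", "hours_estimate": 2.0},
        {"project": "billing", "author": "bob", "hours_estimate": 1.5},
        {"project": "reporting", "author": "alice", "hours_estimate": 3.0},
    ]
    time_entries = [
        {"project": "billing", "hours": 4.0, "bucket": "capitalized"},
        {"project": "reporting", "hours": 2.5, "bucket": "expensed"},
    ]

    # Sum logged hours and commit-based estimates per project, then compare.
    logged = defaultdict(float)
    for entry in time_entries:
        logged[entry["project"]] += entry["hours"]

    estimated = defaultdict(float)
    for commit in commits:
        estimated[commit["project"]] += commit["hours_estimate"]

    for project in sorted(set(logged) | set(estimated)):
        print(f"{project}: logged={logged[project]:.1f}h, "
              f"commit-based estimate={estimated[project]:.1f}h")

The point isn't to grade developers by commit volume; it's just to check that the per-project split of commits roughly tracks the per-project split of logged time, so the capitalize/expense allocation holds up.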
You're correct that business value is a crucial metric, but that's applied across departments and largely for strategic reasons. It's managed and handled entirely separately from how developer time is translated into accounting practices, which is what my post was about.