
Measure code complexity & use it to estimate `score` per developer

Open · EgorBu opened this issue 5 years ago · 2 comments

Hi, 2 points:

  1. Measure code complexity per file/class/function & build a distribution of complexities -> the long tail will show files that could be considered for refactoring (a rough sketch follows after this list).
  2. Average complexity per developer, based on their contribution history -> it will show who tends to write complex code and who does not.
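
As an editorial illustration of point 1 (not code from hercules), here is a minimal Go sketch that approximates cyclomatic complexity per function with the standard go/ast package and prints the functions sorted by complexity. It only handles Go sources; the real analysis would need a language-agnostic parser, as discussed in the comments below, and cyclomatic complexity is just one possible metric.

```go
package main

import (
	"fmt"
	"go/ast"
	"go/parser"
	"go/token"
	"os"
	"sort"
)

// cyclomatic returns a rough cyclomatic complexity estimate for a function:
// 1 plus the number of branching constructs found in its body.
func cyclomatic(fn *ast.FuncDecl) int {
	c := 1
	ast.Inspect(fn, func(n ast.Node) bool {
		switch n.(type) {
		case *ast.IfStmt, *ast.ForStmt, *ast.RangeStmt,
			*ast.CaseClause, *ast.CommClause:
			c++
		}
		return true
	})
	return c
}

func main() {
	fset := token.NewFileSet()
	type scored struct {
		name       string
		complexity int
	}
	var results []scored
	for _, path := range os.Args[1:] {
		file, err := parser.ParseFile(fset, path, nil, 0)
		if err != nil {
			fmt.Fprintln(os.Stderr, err)
			continue
		}
		for _, decl := range file.Decls {
			if fn, ok := decl.(*ast.FuncDecl); ok && fn.Body != nil {
				results = append(results, scored{path + ":" + fn.Name.Name, cyclomatic(fn)})
			}
		}
	}
	// The descending sort puts the "long tail" of complex functions on top.
	sort.Slice(results, func(i, j int) bool {
		return results[i].complexity > results[j].complexity
	})
	for _, r := range results {
		fmt.Printf("%4d  %s\n", r.complexity, r.name)
	}
}
```

Build it and pass Go source files as arguments; the top of the output lists the refactoring candidates from the long tail.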

EgorBu · Jan 18 '19 12:01

@vmarkovtsev Any update on this analysis? I am interested in contributing, but I have never worked with Go. If you can guide me, I can put some time into figuring this out.

rohit-takhar · Jan 11 '20 19:01

I haven't even started this. You are welcome to contribute! Don't worry, the killer feature of Go is that you can start writing functional, idiomatic code in a day or less.

Here is what I expect:

  • We need to decide which code parsing library to use. Babelfish used to be a great option, but it is no longer maintained. The neighboring analysis that collects imports/includes uses TreeSitter, which could be a fair replacement. Otherwise, there may be ready-to-use complexity measurement libs out there. This point will define the details of the following one.
  • internal/plumbing/complexity.go - depends on the line diff. We find which functions were modified, which places in those functions changed, and try to estimate the complexity of those places. There is a ready analysis that already tells you which functions are new or changed based on Babelfish/UAST, but I am not sure we should use it now.
  • leaves/complexity_score.go - depends on the new complexity calculator and records the complexity scores per developer. It also depends on the Identity analysis (a rough sketch of this data flow follows after the list).
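
To make the last two bullets more tangible, below is a hedged sketch of the data flow, deliberately not using the real hercules PipelineItem interfaces: it assumes the plumbing item can report per-function complexities before and after each commit, and that the Identity analysis supplies integer author IDs. The names FunctionComplexity, ComplexityScore, and Consume here are hypothetical placeholders, not existing hercules API.

```go
package main

import "fmt"

// FunctionComplexity is a hypothetical record the plumbing item could emit:
// the complexity of one function in one revision of a file.
type FunctionComplexity struct {
	File     string
	Function string
	Value    int
}

// ComplexityScore is a hypothetical stand-in for what leaves/complexity_score.go
// could accumulate: complexity added and removed per developer ID (the integer
// IDs that the Identity analysis assigns to authors).
type ComplexityScore struct {
	Added   map[int]int
	Removed map[int]int
}

func NewComplexityScore() *ComplexityScore {
	return &ComplexityScore{Added: map[int]int{}, Removed: map[int]int{}}
}

// Consume records one commit: the before/after complexities of the functions
// that the line diff touched, attributed to the commit's author.
func (s *ComplexityScore) Consume(author int, before, after []FunctionComplexity) {
	old := map[string]int{}
	for _, f := range before {
		old[f.File+"::"+f.Function] = f.Value
	}
	for _, f := range after {
		delta := f.Value - old[f.File+"::"+f.Function]
		if delta > 0 {
			s.Added[author] += delta
		} else {
			s.Removed[author] -= delta
		}
	}
}

func main() {
	s := NewComplexityScore()
	// Developer 0 makes Parse more complex, developer 1 later simplifies it.
	s.Consume(0,
		[]FunctionComplexity{{"a.go", "Parse", 5}},
		[]FunctionComplexity{{"a.go", "Parse", 9}})
	s.Consume(1,
		[]FunctionComplexity{{"a.go", "Parse", 9}},
		[]FunctionComplexity{{"a.go", "Parse", 4}})
	fmt.Println("added:", s.Added, "removed:", s.Removed)
}
```

Keeping added and removed complexity in separate counters is a deliberate choice in this sketch: refactoring work stays visible instead of cancelling out against newly introduced complexity.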

vmarkovtsev · Jan 12 '20 07:01