
eval-aut-code-doc

Repository for the bachelor's thesis:
Evaluation Of Automated Code Documentation Approaches


Preprocessing

The preprocessing procedure can be invoked via preprocess_main.py, which triggers all preprocessing steps sequentially.
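A minimal sketch of the call, assuming it is run from the repository root (the script name is taken from above; everything else the script does is defined by the script itself):

```shell
# Run all preprocessing procedures sequentially.
# Guarded so the sketch is a no-op when invoked outside the repository root.
if [ -f preprocess_main.py ]; then
    python preprocess_main.py
else
    echo "preprocess_main.py not found - run this from the repository root"
fi
```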


Training

The training has to be invoked separately for each approach. Details on how to train each model are available in the authors' repositories: attnFc, code2seq, CodeGNN, NeuralLSPS

Some training procedures require additional files that can be found for download in the authors' repositories.


Evaluation

All approaches can be evaluated sequentially with evaluate_main.py. This requires the preprocessed data and the prediction files produced by the models.
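A minimal sketch of the evaluation call, assuming preprocessing has already run and the models' prediction files are in place (the script name is taken from above):

```shell
# Evaluate all approaches sequentially.
# Guarded so the sketch is a no-op when invoked outside the repository root.
if [ -f evaluate_main.py ]; then
    python evaluate_main.py
else
    echo "evaluate_main.py not found - run this from the repository root"
fi
```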


Link to download final model files for each approach: HERE

Note: some hardcoded paths require adjustment to fit a different directory structure.
