Split feedback_pipeline into multiple files and added a config manager #77

Merged · 2 commits · Nov 25, 2024
2 changes: 1 addition & 1 deletion .github/workflows/docker-image.yml
@@ -27,7 +27,7 @@ jobs:
run: mkdir -p output/history

- name: Run the build
-        run: ./feedback_pipeline.py --dev-buildroot test_configs output
+        run: ./content_resolver.py --dev-buildroot test_configs output

- name: Run tests
run: |
6 changes: 3 additions & 3 deletions README.md
@@ -106,7 +106,7 @@ Please use the `refresh.sh` script as a reference for deployment.

## Developer preview

-If you want to contribute and test your changes, run the `feedback_pipeline.py` script with test configs in the `test_configs` directory.
+If you want to contribute and test your changes, run the `content_resolver.py` script with test configs in the `test_configs` directory.

To run the script, you'll need Python 3 and the following dependencies:

@@ -124,7 +124,7 @@ $ podman run --rm -it --cap-add CAP_SYS_CHROOT --tmpfs /dnf_cachedir -v $(pwd):/

```
# mkdir -p output/history
-# ./feedback_pipeline.py --dev-buildroot --dnf-cache-dir /dnf_cachedir test_configs output
+# ./content_resolver.py --dev-buildroot --dnf-cache-dir /dnf_cachedir test_configs output
```

The output will be generated in the `output` directory. Open the `output/index.html` in your web browser of choice to see the result.
@@ -140,7 +140,7 @@ $ docker run --rm -it --tmpfs /dnf_cachedir -v $(pwd):/workspace content-resolve

```
# mkdir -p output/history
-# ./feedback_pipeline.py --dev-buildroot --dnf-cache-dir /dnf_cachedir test_configs output
+# ./content_resolver.py --dev-buildroot --dnf-cache-dir /dnf_cachedir test_configs output
```

The output will be generated in the `output` directory. Open the `output/index.html` in your web browser of choice to see the result.
122 changes: 122 additions & 0 deletions content_resolver.py
@@ -0,0 +1,122 @@
#!/usr/bin/python3

import datetime
from content_resolver.analyzer import Analyzer
from content_resolver.data_generation import generate_data_files
from content_resolver.historia_data import generate_historic_data
from content_resolver.page_generation import generate_pages
from content_resolver.query import Query
from content_resolver.utils import load_data, log, datetime_now_string, dump_data
from content_resolver.config_manager import ConfigManager



# Features of this new release
# - multiarch from the ground up!
# - more resilient
# - better internal data structure
# - user-defined views


###############################################################################
### Help ######################################################################
###############################################################################


# Configs:
# TYPE: KEY: ID:
# - repo repos repo_id
# - env_conf envs env_id
# - workload_conf workloads workload_id
# - label labels label_id
# - conf_view views view_id
#
# Data:
# TYPE: KEY: ID:
# - pkg pkgs/repo_id/arch NEVR
# - env envs env_id:repo_id:arch_id
# - workload workloads workload_id:env_id:repo_id:arch_id
# - view views view_id:repo_id:arch_id
#
#
#
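The config and data tables above describe a nested, colon-keyed layout. As a rough illustration, here is a minimal sketch of what such a structure could look like in Python. The type names and key names come from the table above; the concrete IDs, the NEVR string, and the inner fields are made-up placeholders, not actual Content Resolver data:

```python
# Illustrative only: a hypothetical instance of the data layout described
# above. Keys follow the table (pkgs/repo_id/arch, envs, workloads, views);
# the concrete IDs and the NEVR string are placeholders.
data = {
    "pkgs": {
        "repo-fedora": {
            "x86_64": {
                "bash-0:5.2.26-3.fc40.x86_64": {"name": "bash"},
            }
        }
    },
    "envs": {
        "env-minimal:repo-fedora:x86_64": {"pkg_ids": ["bash-0:5.2.26-3.fc40.x86_64"]},
    },
    "workloads": {
        "workload-httpd:env-minimal:repo-fedora:x86_64": {"pkg_ids": []},
    },
    "views": {
        "view-all:repo-fedora:x86_64": {"workload_ids": []},
    },
}

# The composite IDs are colon-joined, so a key can be split back into
# its parts (workload_id:env_id:repo_id:arch_id for workloads):
workload_id, env_id, repo_id, arch = next(iter(data["workloads"])).split(":")
```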




###############################################################################
### Main ######################################################################
###############################################################################


def main():

    # -------------------------------------------------
    # Stage 1: Data collection and analysis using DNF
    # -------------------------------------------------

    # measuring time of execution
    time_started = datetime_now_string()
    config_manager = ConfigManager()

    settings = config_manager.settings

    settings["global_refresh_time_started"] = datetime.datetime.now().strftime("%-d %B %Y %H:%M UTC")

    if settings["use_cache"]:
        configs = load_data("cache_configs.json")
        data = load_data("cache_data.json")
    else:
        configs = config_manager.get_configs()
        analyzer = Analyzer(configs, settings)
        data = analyzer.analyze_things()

        if settings["dev_buildroot"]:
            dump_data("cache_configs.json", configs)
            dump_data("cache_data.json", data)

    # measuring time of execution
    time_analysis_time = datetime_now_string()

    # -------------------------------------------------
    # Stage 2: Generating pages and data outputs
    # -------------------------------------------------

    query = Query(data, configs, settings)

    generate_pages(query)
    generate_data_files(query)
    generate_historic_data(query)

    # -------------------------------------------------
    # Done! Printing final summary
    # -------------------------------------------------

    # measuring time of execution
    time_ended = datetime_now_string()

    # Print extra metrics
    if not settings["use_cache"]:
        analyzer.print_metrics()

    # Print base metrics
    log("")
    log("=============================")
    log("Feedback Pipeline build done!")
    log("=============================")
    log("")
    log("  Started:       {}".format(time_started))
    log("  Analysis done: {}".format(time_analysis_time))
    log("  Finished:      {}".format(time_ended))
    log("")


if __name__ == "__main__":
    main()
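The script leans on a few helpers imported from `content_resolver.utils`, whose implementations are not part of this diff. As a non-authoritative sketch, here is what `load_data`, `dump_data`, `datetime_now_string`, and `log` could look like, assuming JSON serialization (which the `cache_configs.json` / `cache_data.json` file names suggest) and an arbitrary timestamp format:

```python
# Hypothetical stand-ins for the content_resolver.utils helpers used above.
# The real implementations are not shown in this PR; this sketch assumes
# JSON on disk and a simple print-based logger.
import datetime
import json


def load_data(path):
    # Read a previously dumped JSON cache file back into Python structures.
    with open(path, "r") as f:
        return json.load(f)


def dump_data(path, data):
    # Persist configs/data so later runs can reuse them via use_cache.
    with open(path, "w") as f:
        json.dump(data, f)


def datetime_now_string():
    # Timestamp used in the "Started / Analysis done / Finished" summary.
    return datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S")


def log(msg):
    # Print the message; a real implementation might also write a log file.
    print(msg)
```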
Empty file added content_resolver/__init__.py