Fetch arm64 camera-streamer in GitHub action #10
base: main
Conversation
This also refactors things a bit to make it easier to include additional packages in the future.
Would it be possible to get this pull request reviewed? I've tried to be careful not to deviate too much from existing style, and if anything doesn't make sense, I'm happy to try and explain. I'm interested in having a 64-bit build in the repository. Thank you in advance!
Sorry, I had my hands full the past weeks, and right now I'm at a conference. I'll see that I can take care of it next week.
Ok, I understand. Thanks in advance!
I think there's a logic issue, see the comment below. The `add-package` workflow likely needs to be adjusted as well to take a list of URLs, not just a single one.
```shell
else
  echo "Package ${NAMES[i]} not in repo"
  echo "geturl=${URLS[i]}" >> "$GITHUB_OUTPUT"
  break
```
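For context on what "a list of URLs" could look like on the receiving side, here is a rough sketch of a `workflow_dispatch` input that accepts a space-separated list. The input name and steps are assumptions for illustration, not the repo's actual add-package workflow:

```yaml
# Hypothetical sketch of an add-package workflow accepting a URL list.
on:
  workflow_dispatch:
    inputs:
      geturls:
        description: "Space-separated URLs of packages to add"
        required: true
        type: string

jobs:
  add-packages:
    runs-on: ubuntu-latest
    steps:
      - name: Fetch all packages in one run
        run: |
          # Word splitting on spaces is intentional here.
          for url in ${{ inputs.geturls }}; do
            curl -fsSLO "$url"
          done
```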
Isn't this going to lead to issues if we find a new version that has two new packages, and thus two new URLs that need to be fetched? `geturl` will only ever hold the last URL to fetch, and only that will be part of the workflow dispatch below.
I actually did this on purpose in order to simplify the logic. Since the workflow runs every hour, the first hour it will do the first URL, and so on until all are done. The rate of new packages showing up is fairly low, and it didn't seem like there would be harm in not updating the repository all at once.
Do you think it would help to include a comment explaining this?
I would actually prefer not to have more than one trigger per version, as those will cause two separate PRs. That doesn't sound like much, but given the size of my TODO list, which always just seems to grow instead of shrinking, fewer work items created by automation is ALWAYS the preference.
So let's find a solution instead that will only create one PR per version, regardless of the number of included packages.
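One way to get a single dispatch per version is to drop the `break` and accumulate every missing package's URL into one space-separated output value. The sketch below assumes stand-in `NAMES`, `URLS`, and `REPO_PACKAGES` arrays in place of whatever the real workflow builds from the release and the repo index; the point is the accumulation pattern, not the lookup:

```shell
#!/usr/bin/env bash
# Sketch only: NAMES, URLS, and REPO_PACKAGES are assumed stand-ins.
# Instead of breaking on the first missing package, collect all of
# their URLs into a single "geturls" output.
set -euo pipefail

NAMES=("camera-streamer" "camera-streamer-dbg")
URLS=("https://example.com/a.deb" "https://example.com/b.deb")
REPO_PACKAGES=("camera-streamer")     # packages already in the repo

GITHUB_OUTPUT="${GITHUB_OUTPUT:-/dev/stdout}"

geturls=""
for i in "${!NAMES[@]}"; do
  found=0
  for pkg in "${REPO_PACKAGES[@]}"; do
    [[ "$pkg" == "${NAMES[i]}" ]] && found=1
  done
  if (( ! found )); then
    echo "Package ${NAMES[i]} not in repo"
    geturls+="${URLS[i]} "            # accumulate instead of break
  fi
done

# One space-separated value, so the workflow dispatch fires exactly
# once per version regardless of how many packages are new.
echo "geturls=${geturls% }" >> "$GITHUB_OUTPUT"
```

The downstream add-package workflow would then split the value on whitespace and fetch each URL in a single run, producing one PR per version.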