publish release artifact containing only proving keys #4130
For posterity, this appears to be the script: https://github.com/penumbra-zone/web/blob/71b71e82bb02a4aa1f8bed7ebf8a92d5757d251e/apps/extension/src/utils/download-proving-keys.ts
Oh, I feel that! You're right, we've been optimistic about not making breaking changes. The web workflows in particular bear the brunt of that. At the risk of sounding out of touch, I suggest that we don't expect these changes to happen frequently, and after mainnet, I think it's safe to say they'll be stable. So, the churn is mostly 11th hour development alignment. Not dismissing the concern, just trying to describe my perspective so we can talk pragmatically about requirements.
GH already automatically posts a tarball of the source code for each tagged release, but unfortunately it only bundles lfs pointers by default, which is rather silly. Based on the docs, I've enabled opt-in inclusion of lfs files for source archives, so on the next tagged release, they'll be in there. It might be more straightforward to download the tarball and copy specific files from that—however, given that the script you've got is already written, it feels "good enough" to me for right now.
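To make the tarball approach concrete, here's a minimal sketch of how a consumer could locate the archive for a tagged release. The repo name and the extraction path are illustrative assumptions, not the actual layout; GitHub's tag-archive URL pattern is the only part taken as given.

```typescript
// Hypothetical sketch: build the URL of the source archive GitHub
// publishes for a tagged release. With opt-in LFS inclusion enabled,
// this archive would contain the real key bytes, not LFS pointers.
const REPO = "penumbra-zone/penumbra"; // assumed repo name

function sourceArchiveUrl(tag: string): string {
  // GitHub serves tag tarballs at this predictable path.
  return `https://github.com/${REPO}/archive/refs/tags/${tag}.tar.gz`;
}

// A consumer could then extract only the key files, e.g. (illustrative
// path glob, not the real repo layout):
//   curl -sL <url> | tar -xz --wildcards '*/proof-params/*.bin'
```

The upside of this route, as noted above, is that the archive is produced by GitHub from the tagged commit, which is a slightly better provenance story than fetching individual raw objects.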
On this count, downloading the tarball with included lfs files provides a better provenance story than pulling "raw" objects from the gh api. However, what you're pointing to is a general lack of a signing authority for verifying code integrity. These artifacts are built in CI, so publishing checksums created from CI alongside them doesn't provide any meaningful security. For paranoid folks, we assume they'll build from source—but that effectively means trusting the clone of the repo they got in the first place. In the context of assurance, what we care about is that the ones and zeroes are not tampered with from developer workstation -> monorepo -> web repo -> web store -> end user devices. There are a few places we can try to improve integrity, as part of a broader effort for a pre-launch security review.
part of the validation concern is not just cryptographic provenance, but a very practical "do i have the correct keys, and are they whole" as we change between branches, restore from caches, and bundle files during development
We discussed this at backlog grooming this week, and consensus was, unsurprisingly: we're totally for-real about to stabilize the keys. 😇 There's a very real forcing function here: at conclusion of phase 2 of the summoning ceremony, we'll have "final" key material that we cannot change without rerunning the ceremony, which we definitely don't want to do. When the time comes to update those keys one last time—probably a few weeks from now—feel free to tag me in on review. I'm happy to perform a tediously manual review to triple-check we've got the right key material where we expect it to be.
The more things change.... Turns out we changed the keys again as part of #4239. Just goes to show that @turbocrime was indeed right to be skeptical about not having to do the manual key-ferrying again. Plans are still in the works for rerunning phase 2 of the summoning ceremony, so for now, we're still committed to updating the keys in the web code via scripts. There is good news at least for more checksums and such in #4195. I hope to get that together in time for the next published release (i.e.
a published artefact containing only the proving keys would simplify development, builds, and dependency navigation for any software that wants to interact with penumbra
yes, i know these aren't supposed to change. they have changed several times since the first day i was told this
currently, anyone using the wasm package runs a script that ad-hoc downloads the keys by manipulating URIs based on parameters such as git tag and a manually maintained mapping of key names
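To illustrate the shape of that ad-hoc approach, here is a hedged reconstruction of what such a script does: combine a git tag with a hand-maintained list of key names to produce raw download URLs. The repo path, key names, and file naming scheme below are all assumptions for illustration; the real script is linked above.

```typescript
// Hypothetical reconstruction of the ad-hoc download scheme: a manually
// maintained mapping of key names, plus a git tag, turned into URLs.
// Key names and paths are illustrative, not the actual repo layout.
const KEY_NAMES = ["spend", "output", "swap"]; // assumed subset

function provingKeyUrl(tag: string, keyName: string): string {
  // assumed path within the repo; the real script encodes this mapping
  return `https://github.com/penumbra-zone/penumbra/raw/${tag}/proof-params/${keyName}_pk.bin`;
}

function allKeyUrls(tag: string): string[] {
  return KEY_NAMES.map((name) => provingKeyUrl(tag, name));
}
```

The fragility is visible in the sketch itself: both the name list and the path template must be updated by hand whenever the upstream layout or key set changes.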
and it's not clear how to correctly/reliably validate the output of that download script. a provided checksum or other endorsed validation method would be nice
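An endorsed validation method could be as simple as a published digest per key file. A minimal sketch, assuming the project were to publish SHA-256 checksums alongside the keys (it does not currently do so, per the discussion above):

```typescript
import { createHash } from "node:crypto";

// Compute the hex SHA-256 digest of downloaded key bytes.
function sha256Hex(data: Buffer | string): string {
  return createHash("sha256").update(data).digest("hex");
}

// Compare a downloaded file against a published checksum, so a consumer
// can tell "correct and whole" from "stale, truncated, or wrong branch".
function verifyKey(data: Buffer | string, expectedHex: string): boolean {
  return sha256Hex(data) === expectedHex;
}
```

This addresses the practical "are my keys whole" concern; as noted in the thread, it does not by itself provide cryptographic provenance if the checksums come from the same CI pipeline as the artifacts.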
also, it makes sense to simply publish these large output files through the infrastructure that github provides for publishing large output files