Unable to Work with Temporary AWS Security Credentials #1514
Comments
Is there a reason you can't use IAM credentials (which auto-renew)? These can be configured in the datacube.conf config file or via environment variables.
They only auto-renew when using boto3 directly; a proper solution will require introducing "shared state" for the credentials we push into GDAL.
There are several types of AWS credentials. What I'm interested in now is using AWS AssumeRoleWithWebIdentity, which is similar to OIDC. Support was added to GDAL in 3.6 (November 2022).
KK:
This used to be the case, but I think it was fixed in 3.1.0.
KK:
I'm not sure how rasterio or ODC fits in, but GDAL since 3.5 has support for using a configuration file to define per-path-prefix credentials or options.
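For reference, a sketch of what such a GDAL configuration file can look like, based on the format described in the GDAL configuration documentation (bucket names and keys below are placeholders; verify the exact syntax against the docs for your GDAL version):

```
[configoptions]
# Global GDAL config options applied at startup
GDAL_HTTP_MAX_RETRY=3

[credentials]

[.private_bucket]
# Options below apply only to paths under this prefix
path=/vsis3/my-private-bucket
AWS_ACCESS_KEY_ID=placeholder
AWS_SECRET_ACCESS_KEY=placeholder

[.requester_pays_bucket]
path=/vsis3/some-requester-pays-bucket
AWS_REQUEST_PAYER=requester
```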
Should have said "the way we put those into GDAL using
Looks like rasterio's AWSSession is not aware of credential refresh. Maybe raise an issue in rasterio. In datacube, a custom Session would be plugged in here based on some config setting: datacube-core/datacube/utils/rio/_rio.py, line 90 (commit f3323b9).
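A rough sketch of what plugging in such a session could look like, assuming rasterio's AWSSession is built from a boto3 Session; the helper name `make_refreshing_rio_env` is hypothetical, not existing datacube API, and this does not by itself solve the refresh problem described above if AWSSession snapshots credentials once:

```python
import boto3
import rasterio
from rasterio.session import AWSSession


def make_refreshing_rio_env(region_name="ap-southeast-2", requester_pays=False):
    """Hypothetical helper: build a rasterio Env backed by a boto3 Session.

    boto3 itself resolves AWS_ROLE_ARN / AWS_WEB_IDENTITY_TOKEN_FILE from the
    environment and refreshes the STS credentials as they expire; whether
    rasterio/GDAL pick up the refreshed values is the open question here.
    """
    boto_session = boto3.Session(region_name=region_name)
    aws_session = AWSSession(session=boto_session, requester_pays=requester_pays)
    return rasterio.Env(session=aws_session)


# Usage sketch: open S3-hosted rasters inside the Env context.
# with make_refreshing_rio_env():
#     with rasterio.open("s3://some-bucket/some-key.tif") as src:
#         print(src.profile)
```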
Thanks for the pointers!
Expected behaviour
We should be able to use temporary/expiring/refreshing AWS security credentials while running ODC code, e.g. via the AssumeRoleWithWebIdentity call of the AWS Security Token Service API.
This can be handled automatically by boto3.
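For illustration, a minimal sketch of how boto3 handles this on its own (the role ARN and token path are placeholders): when `AWS_ROLE_ARN` and `AWS_WEB_IDENTITY_TOKEN_FILE` are set, boto3 calls AssumeRoleWithWebIdentity and transparently re-assumes the role when the temporary credentials expire.

```python
import os
import boto3

# Placeholders: in a real deployment these are injected by the platform,
# e.g. Kubernetes IAM Roles for Service Accounts.
os.environ.setdefault("AWS_ROLE_ARN", "arn:aws:iam::123456789012:role/example-role")
os.environ.setdefault("AWS_WEB_IDENTITY_TOKEN_FILE", "/var/run/secrets/token")

# boto3 performs AssumeRoleWithWebIdentity under the hood and refreshes the
# resulting temporary credentials automatically, so long-running code keeps
# working past the initial expiry.
s3 = boto3.client("s3")
print(s3.list_buckets()["Buckets"])
```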
Actual behaviour
ODC code accessing AWS APIs (such as S3) works initially when the correct environment variables are set, but starts failing once the credentials expire (which for OIDC/WebIdentityProvider defaults to 2 hours) and never recovers, because the credentials are never renewed.
This is inadequate for long processing jobs and for server applications.
More details
There is a comment at https://github.com/opendatacube/datacube-core/blob/develop/datacube/utils/aws/__init__.py#L468-L472 indicating that this is known behaviour when using datacube.utils.aws.configure_s3_access().
Fixing this may be as simple as removing most of the custom AWS setup code we have, as I believe some of it is no longer required given the better AWS support in GDAL and rasterio.
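For context, this is roughly how that helper is used today (a sketch; the exact keyword arguments may differ between datacube versions). It resolves credentials once up front and pushes them into GDAL/rasterio as static values, which is why they are never refreshed:

```python
from datacube.utils.aws import configure_s3_access

# Resolves the current AWS credentials once and configures rasterio/GDAL
# with them as static values; temporary credentials obtained this way are
# not renewed when they expire.
configure_s3_access(
    aws_unsigned=False,   # sign requests with the resolved credentials
    requester_pays=True,  # needed for requester-pays buckets
)
```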
Environment information
Which datacube --version are you using? 1.8.17
What datacube deployment/environment are you running against?
@benjimin