Hi Nicolas,

I'm trying to wrap my head around how I can bulk-delete a collection of metadata I no longer need in my org. The standard approach, of course, is to delete the items in the dev sandbox and retrieve the changes; the package.xml and destructiveChanges.xml are then generated during the CI/CD process, so the items get successfully removed from integration...
However... I want to delete about 80 pieces of metadata, and I'd hate to have to go and manually remove references just to be able to delete custom objects etc., so I want to write destructiveChanges.xml myself rather than rely on it being generated for me in the CI/CD.
I've used destructiveChanges once now, to remove some problematic tests. I did that just by manually putting the entries in the file and it seemed to go OK, but I'm unsure about subsequent use of the file. Do I keep those items in destructiveChanges.xml and add my ~80 more to it, as in the sketch below?
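Something like this is what I have in mind (TestClassA stands in for the test I already removed, and MyOldObject__c / Old_Field__c are just placeholder names for the new items I want to drop):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <!-- Entry kept from the earlier cleanup of problematic tests -->
    <types>
        <members>TestClassA</members>
        <name>ApexClass</name>
    </types>
    <!-- New entries for the ~80 items I now want to delete -->
    <types>
        <members>MyOldObject__c</members>
        <name>CustomObject</name>
    </types>
    <types>
        <members>MyOldObject__c.Old_Field__c</members>
        <name>CustomField</name>
    </types>
    <version>59.0</version>
</Package>
```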
Then, when promoting from integration to UAT, is it this destructiveChanges.xml file that tells the pipeline what to remove from the next-level org (given that it's a full deployment by default, not a delta)? In other words, is the file never 'reset' to blank until the changes have been promoted all the way to production?
Sorry for asking such basic questions. Looking at SGD and at your docs, they assume a 'happy path' but also assume a higher level of knowledge of what the pipeline will ultimately do, so I can't be certain how to do the "right thing" to keep my pipeline functioning properly.
Maybe with some clarity here, I can then make a PR to update the docs to reflect the deeper knowledge!
Al
@readeral it's OK to manually update destructiveChanges.xml if you know what you are doing :)
You can leave the entries there indefinitely: the deploy command is called with --ignore-warnings, so trying to delete metadata that has already been deleted won't make future deployments fail :)
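To illustrate (this is not the exact command the pipeline runs, just a plain Salesforce CLI sketch of a deploy that applies a destructive manifest while tolerating warnings; paths and the org alias are placeholders):

```sh
# Deploy the package and apply destructiveChanges.xml after the deployment.
# --ignore-warnings keeps warnings about already-deleted components
# from failing the run.
sf project deploy start \
  --manifest manifest/package.xml \
  --post-destructive-changes manifest/destructiveChanges.xml \
  --ignore-warnings \
  --target-org my-integration-org
```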
If you want to update the documentation with a PR, contributions are more than welcome :)