tomkwong | 3 years ago
This story resonates with many people here because many experienced engineers have done something similar before. For me, a destructive batch operation like this would be split into two distinct steps:
1. Identify files that need to be deleted; 2. Loop through the list and delete them one by one.
These steps are decoupled so that the list can be validated. Each step can be tested independently. And the scripts are idempotent and can be reused.
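The two-step pattern above can be sketched roughly as follows. This is a minimal illustration, not the commenter's actual tooling; the manifest filename and the predicate are hypothetical, and real production scripts would add logging and dry-run flags:

```python
import os

MANIFEST = "files_to_delete.txt"  # hypothetical manifest path

def build_manifest(root, predicate):
    """Step 1: collect candidate files into a manifest for review."""
    with open(MANIFEST, "w") as f:
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                if predicate(path):
                    f.write(path + "\n")

def delete_from_manifest():
    """Step 2: delete each listed file.

    Skipping files that are already gone makes the script idempotent,
    so a partially completed run can simply be re-executed.
    """
    with open(MANIFEST) as f:
        for line in f:
            path = line.strip()
            try:
                os.remove(path)
            except FileNotFoundError:
                pass  # already deleted; safe to re-run
```

Because the manifest is an ordinary text file, it can be eyeballed, diffed, or sampled before step 2 ever runs, and either step can be tested in isolation.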
Production operations are always risky. A good practice is to prepare an execution plan with detailed steps, a validation plan, and a rollback plan, and to review the plan with peers before the operation.
notyourday|3 years ago
> These steps are decoupled so that the list can be validated. Each step can be tested independently. And the scripts are idempotent and can be reused.
This is the most underrated comment.
I'm saying this as someone who had ultimate oversight of deleting hundreds of TBs per day, spread over billions of files, across different clouds and local storage.
spiffytech|3 years ago