item 33973297


tetsusaiga | 3 years ago

Basically zero effort/intelligence compared to a lot of these, but the business wanted backup records from our CMS (of sorts), because they were gun-shy about catastrophic losses after spending something like 250 human-hours on manual entry the last time. Thing was, the API provided zero means for export, either mass or individual.

Motivated to prove myself, I ended up writing a crawler with all kinds of contingencies for the crap UI, which would press the "Download" button that downloaded each record as a JSON file (a couple thousand of these). Then my little node app would shoot the JSON files to an S3 bucket for safekeeping, parse them, and save each record in DynamoDB, I believe it was.
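The record-shuffling half of that pipeline might look roughly like the sketch below. It's a guess at the shape, not the original code: the crawler and AWS calls are only outlined in comments (they need Puppeteer, the AWS SDK, and credentials), and the one runnable piece is a hand-rolled converter from a plain JSON record into DynamoDB's AttributeValue format (the real AWS SDK ships a `marshall()` utility for this).

```javascript
// Hypothetical sketch of the backup flow described above.
// Names like RECORDS_TABLE and the record fields are assumptions.

// Convert a plain JSON record into DynamoDB AttributeValue shape.
// Handles only the scalar types a simple CMS record would contain.
function toDynamoItem(record) {
  const item = {};
  for (const [key, value] of Object.entries(record)) {
    if (typeof value === "number") item[key] = { N: String(value) };
    else if (typeof value === "boolean") item[key] = { BOOL: value };
    else item[key] = { S: String(value) };
  }
  return item;
}

// The surrounding flow, roughly (external pieces omitted):
//   1. headless-browser crawler clicks the CMS "Download" button per record
//   2. each JSON file is uploaded to S3 (e.g. PutObjectCommand in SDK v3)
//   3. the parsed record is written to DynamoDB (e.g. PutItemCommand)

const record = { id: 42, title: "Example record", archived: false };
const item = toDynamoItem(record);
// item: { id: { N: "42" }, title: { S: "Example record" },
//         archived: { BOOL: false } }
console.log(JSON.stringify(item));
```

The AttributeValue wrapping is the part that usually surprises people coming to DynamoDB's low-level API: every value carries an explicit type tag (`S`, `N`, `BOOL`, ...), and numbers are sent as strings.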

Doesn't take a genius to come up with the idea to write a crawler, but no one else did, and the business was entering a state of frantic desperation re: this issue, so I felt pretty smart for a bit.

Then a few months later they abandoned the CMS and all its corresponding data.


No comments yet.